Machine Vision

Embedded Vision Summit: Where Engineers Connect Directly

17.07.2024 - Interview with Jeff Bier, Founder and CEO of Edge AI and Vision Alliance

Get a sneak peek at what's coming up at Embedded Vision Summit 2024. Jeff Bier discusses the exciting lineup, including groundbreaking presentations on embedded vision, panel discussions on AI and edge computing, and more than 75 exhibitors showcasing cutting-edge machine vision technologies.
 

inspect: What highlights can visitors expect this year?

Jeff Bier: Attendees can look forward to a program packed with exciting speakers and topics, a bustling show floor with more than 75 exhibitors, a Deep Dive Session hosted by Qualcomm, a live start-up competition and a fascinating keynote address.

Our keynote session, “Learning to Understand Our Multimodal World with Minimal Supervision,” will be presented by Professor Yong Jae Lee of the University of Wisconsin-Madison. He’ll be sharing his groundbreaking research on creating deep learning models that comprehend the world with minimal task-specific training, and the benefits of using images and text—as well as other modalities like video, audio and LiDAR.

We’ll also be hosting a panel on generative AI’s impact on edge AI and machine perception, a kind of part two of last year’s session (check out a recording of last year’s session). Industry leaders will lend their perspectives to questions like: How do recent advancements in generative AI alter the landscape for discriminative AI models, like those in machine perception? Can generative AI eliminate the need for extensive hand-labeled training data and expedite the fusion of diverse data types? With generative models boasting over 100 billion parameters, can they be deployed at the edge?

Two General Sessions are also on the agenda. The first is “Scaling Vision-Based Edge AI Solutions: From Prototype to Global Deployment” given by Maurits Kaptein of Network Optix, covering how to overcome the networking, fleet management, visualization and monetization challenges that come with scaling a global vision solution. The second will be from Jilei Hou of Qualcomm. Jilei will lay out the case for why these large models must be deployed on edge devices, and Qualcomm’s vision for how we can overcome the key challenges standing in the way of doing so.

We’re also looking forward to the annual Women in Vision Reception—an event that brings together women working in computer vision and edge AI to meet, network and share ideas.

inspect: What is new compared to the Embedded Vision Summit 2023?

Jeff: Like in years past, we categorize the sessions (see them all here) at the Summit into four tracks: Enabling Technologies, Technical Insights, Fundamentals and Business Insights. Some of the most interesting themes covered in the tracks are: 

From model to product: Almost every product incorporating AI or vision starts with an AI model. And the latest models tend to get all the buzz – whose object recognition model is fastest, that sort of thing. But the industry has learned over the last few years that you need much more than a model to have an actual product. For example, there’s non-neural-network computer vision and image processing to be done. There’s dealing with model updates. There’s real-world performance to be monitored and measured and fed back to the product team. This year we’ve got a number of great talks focused on what it takes to bring AI-based products to market.

Generative AI’s impact on edge AI and machine perception: The integration of generative AI into edge AI and machine perception systems holds promise for improving adaptability, accuracy, robustness and efficiency across a wide range of applications, advancing the capabilities of edge devices to process and understand complex real-world data. Many of our speakers and exhibitors will be focusing on how things like vision language models and large multimodal models work, and how they’ve been using these new techniques.

Transitioning computing from the cloud to the edge: As the demand for real-time, real-world AI applications continues to rise, the shift from cloud computing to edge deployment is gaining momentum. This transition offers numerous advantages: reduced bandwidth and latency, improved economics, increased reliability, and enhanced privacy and security. By bringing AI capabilities closer to the point of use, companies can unlock new opportunities for innovation and deliver more seamless user experiences.


inspect: How many exhibitors will be on site? What will visitors see here?

Jeff: Every year the Technology Exhibits are my favorite part of the event—there’s this sense of excitement and innovation. We’ll be hosting more than 75 sponsors and exhibitors at this year’s Summit. Every building-block technology will be represented—from sensors, lenses and camera modules to processors and accelerators to software tools and libraries, it’s all there. Attendees can expect to see demos of technology that utilizes processors, algorithms, software, sensors, development tools, services and more – all to enable vision- and AI-based capabilities that can be deployed to solve real-world problems.

As an engineer, the thing I love about it is that it’s one of the few places where you can talk directly with the engineers who designed the things you’re looking at, and get your questions answered in a candid fashion.


inspect: Why should you visit the Embedded Vision Summit 2024?

Jeff: From the very first Summit, we’ve prioritized the best content, contacts and technology. That tradition continues 13 years later – let me break it down a little more: 

Best content: Our attendees get insights from people who are expert practitioners in the industry and who have proven track records in computer vision and perceptual AI. They not only use this technology, but they live and breathe it. 

Best contacts: We attract top-notch attendees and exhibitors. That gives attendees the opportunity to meet more game-changing partners and contacts in one place.

Best technology: We are dedicated to making sure we and our exhibitors bring the most relevant technologies that will help our attendees reach their goals.

Author
David Löh, Editor-in-Chief of inspect

Contact

Edge AI and Vision Alliance

1646 North California Blvd., Suite 220
Walnut Creek, CA 94596
United States

+1 925 954 14 11
+1 925 954 14 23

