
Flowing Forward: The Impact of IoT on Plumbing

The intersection between the Internet of Things (IoT) and artificial intelligence (AI) is transforming homes into smart ecosystems where connected devices communicate and automate tasks to enhance convenience. That smart evolution is coming to plumbing as well. 

IoT is reshaping the plumbing industry, bringing forth a wave of innovative solutions for homeowners just in time for the winter season. Here, we’ll explore how manufacturers are implementing these technologies in plumbing, their benefits, and the problems they solve. Additionally, we’ll highlight some cutting-edge smart devices and sensors leading the charge in this transformative journey. 

Leak Detection and Prevention 

Water damage is one of the most common causes of home insurance claims and affects about 14,000 people a day in the United States1. Undetected leaks can lead to a host of problems for homeowners, including water damage, mold, higher water bills, and electrical damage. 

Manufacturers are building IoT-enabled smart water leak detectors to identify leaks before they cause substantial damage. Moen’s Smart Leak Detector, chosen by Wired magazine as the best overall water leak detector2, is a standalone sensor that homeowners can place anywhere indoors3. When the device detects the presence of water, high humidity, or extreme temperatures, it will send push notifications to the homeowner by phone, email, or text. The device, shaped like a drop of water, can be configured through an iOS or Android app and chained together with two other devices to cover large areas and prevent potential disasters. 
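
As a rough illustration of the kind of alerting logic such a sensor runs, the sketch below checks a reading against simple thresholds and raises notifications. The thresholds, reading format, and notify hook are hypothetical placeholders, not Moen's firmware.

HUMIDITY_MAX = 85.0          # percent relative humidity; hypothetical threshold
TEMP_RANGE_C = (5.0, 40.0)   # outside this range, warn about freeze or overheat risk

def evaluate(reading: dict) -> list[str]:
    """reading = {'water': bool, 'humidity': float, 'temp_c': float} (hypothetical format)."""
    alerts = []
    if reading["water"]:
        alerts.append("Water detected")
    if reading["humidity"] > HUMIDITY_MAX:
        alerts.append("High humidity")
    if not (TEMP_RANGE_C[0] <= reading["temp_c"] <= TEMP_RANGE_C[1]):
        alerts.append("Extreme temperature")
    return alerts

def notify(alerts: list[str]) -> None:
    for message in alerts:   # stand-in for the push, email, or SMS delivery path
        print(f"ALERT: {message}")

notify(evaluate({"water": True, "humidity": 60.0, "temp_c": 21.0}))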

Prescriptive Maintenance 

Appliances, including water heaters, may fail unexpectedly, disrupting daily life. Almost all homes in the United States have water heaters, and 27 million households have water heaters that are more than 10 years old and nearing the end of their functional lives4. Property damages from leaking or burst water heaters can add $1,500 to the $3,000 price tag for water heater replacements. 

Manufacturers like V-Guard are integrating AI into water heaters to allow predictive performance analysis, enabling prescriptive maintenance that can identify issues before they become critical5. Their IoT-enabled, smartphone-compatible Verano water heater runs self-diagnostics to detect problems and alert homeowners with automated messages about maintenance issues. Verano also continuously monitors the water temperature and sends alerts when it rises above set levels. Homeowners can operate Verano anywhere in the world with the V-Guard Smart App, enabling scheduling for showers or automated cut-offs when water temperatures rise above a set level. 

Water Quality 

Americans drink about a billion glasses of water a day, and 58% worry about the safety of their drinking water6. Although the drinking water in the United States is among the safest in the world, contamination—such as in Flint, Michigan—does occur. 

Kohler has designed its Aquifer Refine water purification system to sit under a home’s kitchen faucet and improve water quality7. It features a three-stage filtration system to filter out mercury, lead, chlorine, certain pharmaceuticals, viruses, and bacteria. Wi-Fi connectivity allows homeowners to monitor their water usage, filter life, and receive alerts about potential leaks. The system also connects to Amazon’s Alexa, enabling homeowners to automatically order new filters when the current filter life reaches 10%. 

Outlook for IoT and Plumbing 

These examples of integrating IoT and AI into plumbing systems are just a few of the devices currently enhancing the lives of homeowners. As technology continues to advance, we can anticipate the following: 

Enhanced energy efficiency driven by AI algorithms will further optimize water heating and usage, contributing to more sustainable and eco-friendly homes. Wider adoption will make smart devices more affordable, making smart plumbing solutions commonplace in households. And plumbing systems will seamlessly integrate with other smart home devices, creating comprehensive ecosystems that enhance overall convenience and efficiency. 

Smart devices are transforming residential plumbing into dynamic, responsive, and intelligent home systems. As manufacturers continue to innovate, the outlook for IoT and plumbing is undoubtedly one of continued growth and increased integration into the fabric of modern living. 

How Ambiq is Contributing 

Smart sensors and devices that connect to in-home appliances and report on plumbing and energy usage require continuous monitoring and a long-lasting battery so they don’t fail when they’re needed most. 

Ambiq makes a wide range of system-on-chips (SoCs) capable of processing complex data at ultra-low power, giving developers the flexibility to build intuitive monitoring devices consumers can rely on. Built on the patented Subthreshold Power Optimized Technology (SPOT®) platform, these SoCs reduce the total system power consumption of battery-powered endpoint devices. The result is a smart home that stays connected across all your devices. 

Sources: 

1 Water Damage Statistics and Information | 2023 
2 The 6 Best Water Leak Detectors for Your Home | September 18, 2023 
3 Moen | 2023 
4 What causes a water heater to leak or rupture? | November 30, 2017 
5 V-Guard | 2023 
6 Water Quality Facts and Stats | 2023 
7 Kohler | 2023 

Staying Safe this Winter with Wearables

As the coldest months approach, learning more about how new technology can keep people safer in harsh conditions becomes more important and relevant. Thanks to the Internet of Things (IoT), wearable devices can potentially keep people safe and even warm from the adverse effects of winter, including extreme temperatures, storms, icy conditions, and more. Especially for people who work outside during the winter or travel long distances, wearable IoT devices and technology are an attractive and affordable safety solution for winter conditions. 

Since 1979, more than 19,000 Americans have died from cold-related causes, including cardiovascular diseases, respiratory viruses1, hypothermia, and frostbite. As fitness trackers and smartwatches become smaller, more affordable, and more energy-efficient, they become viable options for continuously monitoring vitals, temperatures, locations, and other safety-related stats during winter. 

How Wearables Keep People Safe in Winter 

Wearables are designed to monitor movement, alert users to potentially dangerous changing conditions, and track vitals like body temperature, heart rate, blood oxygen, and more. In winter, they take on another role: monitoring the external environment. People become more prone to accidents, such as slipping on ice or getting lost in a snowstorm, and wearables can offer life-saving features that notify care teams of your location. Wearables help keep people safer year-round, and winter is no exception. 

Smart Clothing 

Temperatures can drop quickly in the winter, and these rapidly changing conditions can be dangerous, especially in storms. Your best defense against the cold is warm clothing, and some manufacturers are taking it a step further by embedding temperature sensors in smart clothing that adapts to the weather. 

Ministry of Supply is a Boston-based clothing company founded by three MIT students who created a smart heated jacket called the Mercury2. The jacket uses internal and external thermometers and an accelerometer to measure temperature and movement, then sends this information to heating pads that can warm the jacket up to 135°F/57°C – about the same temperature as a cup of coffee. It uses AI to learn a user’s preferences, ensuring temperatures are always comfortable, and connects to Alexa so you can warm it up before you leave the house. 

Location Tracking 

One of the best uses of wearables for safety is through GPS and location tracking. It’s not uncommon to get stranded somewhere due to snow during winter, especially when skiing or snowboarding. With geolocation abilities, wearables can help you find your way back to civilization no matter where you are. 

The Garmin fēnix® 7 is a smartwatch with loads of geolocation features, including access to multiple global navigation satellite systems such as GPS, GLONASS, and Galileo3. It supports downloading topographic and satellite maps directly on the watch, and with outstanding battery life, the fēnix® 7 is ready to get you out of a tough jam. 

Safety communications 

Many wearables, such as smartwatches, are making it easier than ever to call emergency services in the event of an accident. With sensors and tracking capabilities, wearables can detect car crashes, falls, and other dangerous incidents, from hard falls to hypothermia scares. Falls are especially common in winter thanks to slippery ice, and wearables can alert caregivers or emergency contacts if you become too incapacitated to make a call.  

The Apple Watch, for example, can call emergency services if it detects you have fallen4. If you become immobile after a fall, the Apple Watch will detect inactivity after about a minute and start to vibrate on your wrist to get your attention while sounding a siren to alert anyone nearby. After another minute, it will notify emergency services and your emergency contacts of your location so you can get the care you need.  
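
A simplified sketch of that escalation flow is shown below; the timings, function names, and callbacks are hypothetical placeholders rather than Apple's implementation.

import time

FALL_WAIT_S = 60    # wait for movement after a detected hard fall
SIREN_WAIT_S = 60   # vibrate/siren period before contacting emergency services

def respond_to_fall(wearer_moved, vibrate, sound_siren, call_for_help):
    """All four arguments are callables supplied by a (hypothetical) device layer."""
    deadline = time.monotonic() + FALL_WAIT_S
    while time.monotonic() < deadline:
        if wearer_moved():
            return               # wearer is responsive; stop escalating
        time.sleep(1)
    vibrate()                    # get the wearer's attention
    sound_siren()                # and the attention of anyone nearby
    deadline = time.monotonic() + SIREN_WAIT_S
    while time.monotonic() < deadline:
        if wearer_moved():
            return
        time.sleep(1)
    call_for_help()              # notify emergency services and contacts with the location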

Limitations of Winter Wearables 

Like other types of wearable fitness trackers, winter wearables have limitations when it comes to accuracy, affordability, and battery life. Wearable sensors might become dislodged during extreme activity, limiting their accuracy and effectiveness. Data might not be 100% accurate, and high costs can be a barrier for some. There are also concerns about privacy and security regarding storing personal health information. Regardless of these limitations, manufacturers are developing smarter and more capable devices that fit any budget, are more accurate, and protect the user’s data. 

The Future of Safer Winters with Wearables 

The global wearables market is projected to hit $142 billion by 20305, and over 30% of Americans are currently using endpoint intelligence like health trackers6. As adoption increases, wearable technology can become more specialized, using artificial intelligence to focus specifically on winter conditions. This might include enhanced location tracking, easier ways to contact emergency services, or more temperature alerts for chronically ill individuals. 

Mental health also takes a hit in winter, thanks to fewer hours of sunlight, more time indoors, and a drop in serotonin levels. In the future, wearables with augmented reality and virtual reality enhancements could provide immersive, sun-filled experiences that may increase serotonin levels. 

How Ambiq Contributes 

Battery-powered IoT devices built on an energy-efficient system-on-chip (SoC) like those from Ambiq become winter-ready with power-efficient processing that performs complex inferencing while sipping power. Wearables can operate on a single charge for days, weeks, or even months, ensuring the health tracker is on standby through icy, snowy, or freezing winter temperatures. In the event of a life-saving emergency, you can count on an Ambiq-powered device. 

Sources: 

1 Climate Change Indicators: Cold-Related Deaths | 2021 
2 Ministry of Supply’s new jacket can heat itself and listens to your voice commands | February 23, 2018 
3 Garmin fēnix® 7 – Standard Edition | Multisport GPS Smartwatch | 2023 
4 Use Fall Detection with Apple Watch – Apple Support | 2023 
5 Wearable Technology Market Size, Share, Analysis, Trends, Growth Report, 2030 | October 4, 2023 
6 Study reveals wearable device trends among U.S. adults | June 15, 2023 

Trends in IoT to Watch in 2024

It’s safe to call the Internet of Things (IoT) a fact of life, with around 17 billion connected devices projected for 20241. And unlike some sectors, its exponential growth isn’t leading to rapid stagnation: IoT is constantly advancing, and ground-shaking innovations are par for the course. 

2024 is already shaping up to deliver major transformations, and big things are happening at the broader societal level. Here are four trends to look out for in the days to come: 

Architectures Become More Flexible 

As technology evolves and becomes more diverse, the demand for cohesive networks that developers can seamlessly build on across various platforms will only grow.  

Open-source architecture has become a winning solution. By enabling collaboration, improvements, and flexibility, open-source architectures let developers adapt to their application’s specific needs. The result is agile environments, higher scalability, and continuous improvement. 

Recently, Xiaomi announced the Vela Platform at their IoT Ecological partner conference. The Vela platform is an embedded software platform built on the open-source real-time operating system NuttX. Vela provides unified software services on various IoT hardware platforms, supporting multiple components and a user-friendly framework to integrate fragmented IoT application scenarios.  

It offers a range of tools that help developers debug and is already picking up support from big names in integrated circuits, including Ambiq, whose Apollo4 Blue Display Kit is designed to show off the graphical and technical capabilities of our system-on-chips (SoCs) for IoT devices. 

Ultra-low Power Consumption Will Start to Become the Norm 

Artificial intelligence (AI) is becoming increasingly integral to business workflows. However, depending on an external cloud isn’t always viable when your sensors and other assets are on-premises. If your sector leans heavily on IoT devices, you’ll have to rethink how you use power to keep up. 

AI is known for consuming massive amounts of power. While the training phase commonly shoulders most of the blame, studies show that inference is also energy-hungry2. This doesn’t bode well for use cases like stock management, early disease diagnosis, and other innovative applications. 

Ultra-low-power SoCs will take center stage in the push for IoT-powered AI sustainability. The increasing availability of AI-enabled endpoint devices will only spur demand. Low-voltage technology like Ambiq’s Subthreshold Power Optimized Technology (SPOT®) has proven that high-end computing is possible without massive energy consumption. Now, it’s just a matter of putting the pieces together. 

IoT Cybersecurity Will Blossom for Endpoint Devices 

AI is a powerful tool, so what happens when it winds up in the wrong hands? Unfortunately, we’re already starting to see the ramifications play out. AI-generated hacks are emerging as some of the biggest cybersecurity threats of the modern era.  

AI can make finding exploits far less time-consuming and far more effective. Worse, the wide accessibility of cloud technology makes it all the more appealing to cybercriminals3. This is already happening in the consumer sector, with potentially devastating results. Email security provider Egress revealed that AI makes phishing campaigns harder to detect4. It’s also helping bad actors sidestep quarantines and detectors. 

The good news is that fighting fire with fire could be a valid defense. ML models like support vector machines and random forests have been shown to detect suspicious activity with high accuracy and efficient memory usage5. If your most vulnerable assets become compromised, embedding AI-capable hardware right on the endpoint could be an effective countermeasure to regain control over those assets or minimize the damage. 
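
As a rough sketch of that approach, the example below trains a scikit-learn random forest on synthetic, hypothetical per-connection traffic features; the feature set and data are placeholders, not any vendor's detection model.

import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import classification_report

# Hypothetical features per connection: packet rate, mean payload size,
# failed-auth count, and number of distinct destination ports contacted.
rng = np.random.default_rng(0)
normal = rng.normal([20, 300, 0.1, 3], [5, 80, 0.3, 1], size=(500, 4))
attack = rng.normal([200, 60, 4.0, 40], [50, 30, 2.0, 10], size=(500, 4))
X = np.vstack([normal, attack])
y = np.array([0] * 500 + [1] * 500)   # 0 = benign, 1 = suspicious

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, random_state=0)
model = RandomForestClassifier(n_estimators=100, random_state=0)
model.fit(X_train, y_train)
print(classification_report(y_test, model.predict(X_test)))

The same kind of compact model can, in principle, run on an AI-capable endpoint so that suspicious behavior is flagged locally rather than only in the cloud.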

Regulations May Grow Sharper Teeth 

While not an exciting trend, IoT regulatory compliance is becoming more of a necessity by the second. Governments may be warming up to IoT like never before, but they haven’t thrown all caution to the wind. 

In early December 2023, the US Office of Management and Budget (OMB) exemplified this with new guidance. To comply with the OMB’s rules, federal agencies must provide detailed inventories of their IoT devices by the end of fiscal year 20246. 

The US wasn’t the only nation to move in this direction. In October, the UK government finalized a sweeping set of security regulations for connectable products — devices that either connect to the internet or send and receive data via EM transmissions7. These rules targeted manufacturers, covering everything from password strength to updates. 

It wouldn’t be surprising to see vendors face similar requirements. Also, data privacy laws like the EU’s GDPR have long since established that being a foreign-domiciled enterprise doesn’t exempt you from compliance burdens. Regulatory crackdowns don’t necessarily mean application growth has to slow; it just needs a more comprehensive approach to governance. Hardware vendors can and should play a role in this journey. 

How Ambiq is Contributing 

These predictions aren’t exclusive to 2024. They’ll continue to pose concerns in the years to follow — and waiting to play catch up isn’t quite a winning strategy. 

Ambiq’s ultra-low power portfolio empowers you to meet increased processing demand and adapt to the changing landscape. From driving Arm® Cortex®-M4 energy consumption lower than thought possible with the Apollo4 Blue Plus SoC to ensuring always-on devices live up to their name, we’re helping IoT reach its true potential. Dive into our catalog to find out more. 

Sources: 

1 Internet of Things (IoT) in the US – statistics & facts | December 19, 2023
2 The growing energy footprint of artificial intelligence | October 18, 2023
3 Prepare for AI Hackers | March-April 2023
4 2023 Phishing Threat Trends Report | 2023
5 Detecting Cybersecurity Attacks in Internet of Things Using Artificial Intelligence Methods: A Systematic Literature Review | January 10, 2022
6 OMB guidance asks agencies to provide inventory of IoT assets | December 6, 2023
7 UK government finalises IoT cybersecurity requirements | October 25, 2023 

Scott Hanson, Founder and CTO of Ambiq: Interview with SafetyDetectives

In a recent interview with SafetyDetectives, Scott Hanson, the Founder and CTO of Ambiq, delved into the creation and evolution of the SPOT platform, which focuses on building the world’s most energy-efficient chips. Originating from Hanson’s time as a PhD student at the University of Michigan, where he developed tiny systems for medical implants, the SPOT platform utilizes Subthreshold Power Optimized Technology to achieve unprecedented energy efficiency.

Hanson discussed the platform’s departure from conventional digital chip designs, emphasizing the significant energy savings achieved by operating at low voltages. The interview covered Ambiq’s role in addressing technical challenges in the IoT industry, emphasizing the crucial aspects of power efficiency and security. Hanson also expressed his optimism about the future of ultra-low power technology, foreseeing continuous improvements and a surge in compute power for IoT devices in the next 5-10 years.

Can you describe the journey that led to the creation of the SPOT platform?

Hi, my name is Scott Hanson, and I’m the founder and CTO of Ambiq.

Ambiq is a company that builds the world’s most energy-efficient chips. We’re putting intelligence everywhere; that is really the tagline. We want to make chips with such low power that we can embed little microprocessors in everything: your clothing, the paint on the walls, the bridges we drive over, pet collars, and so on.

The company is built around a technology that we call SPOT, or Subthreshold Power Optimized Technology. It’s a low-power circuit design technology platform from my time at the University of Michigan. I was a PhD student there, and we were building tiny systems for medical implants.

When I say tiny, I mean really small. We’re talking about one cubic millimeter containing a microprocessor, a radio, an antenna, sensors, and a power source. When you’re building a system like that, the first thing you figure out is that the battery’s tiny, so the power budget that corresponds to that battery must also be tiny.

We got to thinking in terms of picowatts and nanowatts, right? We usually think about watts and kilowatts in the normal world, but we had to think about picowatts and nanowatts. When we did that, we could build these cubic millimeter systems; they could be implanted in the eyes of glaucoma patients to measure pressure fluctuations. It was exciting that this SPOT platform made all of that possible.

During that project, I started to see a lot of interest from companies in the technology, and it got me thinking that this technology had commercial potential. I remember riding in an elevator at the University of Michigan in the Computer Science building and realizing that the technology would be commercialized one day, and I needed to be the person to do that.

Shortly afterward, I started Ambiq with my two thesis advisors at Michigan, and we managed to raise some money. Then we launched our first few products, and here we are 13 years later, having shipped well over 200 million units of our SPOT-enabled silicon. It’s been a really fun journey.

How does the SPOT platform fundamentally differ from traditional microcontroller technology?

I’m going to avoid giving you the full Circuits 101 lecture here. Conventional digital chips, not just microcontrollers but any chip, signal digital ones and digital zeros using voltage.

A digital zero might be 0 volts, and a digital one will be a much higher voltage at 1 or 1.2 volts. That higher voltage is chosen because it’s easy to distinguish between zero and one. It’s also chosen because 1 or 1.2 volts is much higher than the turn-on voltage of the transistor.

Every modern chip is basically made of transistors, the fundamental building block. We’ve got billions of transistors on these chips, and they look like little switches. Think of them like a light switch. When the voltage applied to that transistor is above a turn-on voltage, what we call the threshold voltage, it turns on. Drop below the threshold voltage, and it turns off. You can see how you can string these things together and get devices that signal zeros and ones.

If you look at your Circuits 101 textbook, it teaches you to apply a voltage well above the threshold voltage, or turn-on voltage, so the transistor functions as a proper digital switch. Pretty much every chip out there operates in that way. However, at Ambiq, with SPOT, we ignore that convention. We represent the one with a much lower voltage; think 0.5, 0.4, or 0.3 volts. If that voltage is below the turn-on voltage of the transistor, we call that subthreshold. If it’s at or near the turn-on voltage, it’s called near-threshold.

It turns out that by shrinking what a digital one is, you get some huge energy savings. Energy scales with the square of voltage, so it’s quadratic, and therefore you get this huge energy reduction by operating at low voltage.
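
As a back-of-the-envelope illustration of that quadratic relationship (C here is simply the switched capacitance, not an Ambiq figure), the dynamic switching energy of CMOS logic scales roughly as

E_{\text{switch}} \propto C \cdot V_{dd}^{2}, \qquad \frac{E(1.2\,\mathrm{V})}{E(0.4\,\mathrm{V})} = \left(\frac{1.2}{0.4}\right)^{2} = 9

so dropping the supply from 1.2 V to 0.4 V cuts switching energy by roughly 9x, before any of the subthreshold idiosyncrasies described next come into play.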

That comes with all kinds of idiosyncrasies, and it’s what’s kept all the other companies away. Transistors start behaving weirdly; they still operate as switches, but they become tough to manage. Our SPOT platform is about dealing with those idiosyncrasies, those subthreshold challenges, and it works.

Subthreshold and near-threshold technology has been around for decades. Ambiq was the first to really commercialize it widely, and our timing was perfect. About ten years ago, as the company was getting off the ground, the Internet of Things (IoT) was just starting to explode. Battery-powered devices were going everywhere, and there was this insatiable hunger for low power. The SPOT platform came along at the right time, and we’ve solved a lot of power problems, but the need for more compute continues to grow.

AI is popping up everywhere. We’re seeing that it’s not just in the cloud but also beginning to appear in endpoint devices like the ones we serve, such as consumer wearables or smart home devices. AI is straining the power budgets of all these devices, which means we’ve got to continue to innovate and release new products that are lower power.

What are the environmental implications of the widespread adoption of ultra-low power technologies?

Lower power is good for the environment, and it’s a green technology. If I have a given power need, low-power technology lets me reduce it, and that’s good. However, the truth is a little bit more complex than that.

What tends to happen is that our customers don’t necessarily take advantage of consuming less energy by having smaller batteries and charging less often. Instead, they tend to add new functions. They’ll say, you have a more power-efficient processor? Then, I’ll add more stuff to that same power budget. So, the power footprint of the world is not really decreasing; it’s just that we’re able to get more done in that same power footprint.

It’s probably a wash in terms of environmental impact. That said, the IoT as a whole has some pretty fantastic potential for the environment. As we put sensors all over our homes and buildings, and even climate sensors all over the world, we get a better sense of what’s going on: whether it’s climate change we’re able to track, how much energy a building is using, or whether we’re leaving the lights on in a hallway where nobody is present.

So there’s potential to use the Internet of Things to dramatically reduce our energy usage, managing the energy consumption of buildings and homes better, but also to get a better sense of what’s happening in the world.

So I think there is the potential for our technology to be used in a very positive way. But most of our customers tend to be using it just to kind of get more out of their existing power footprint.

What are the biggest technical challenges facing the IoT industry today?

Power is one of them, and it’s one that we’re addressing diligently every day. We want to put billions of devices everywhere, and you don’t want to be changing billions of batteries. So, having a low-power platform like SPOT is really critical.

Security is a growing concern. IoT has developed very quickly and often without a proper eye toward security needs. The average person has tens of IoT devices in their home now, and they are collecting all kinds of intimate data about us: health information, movement patterns in the home. We’re seeing virtual assistants constantly capturing our speech to listen for the “Alexa” keyword or the “OK Google” keyword, and that’s not stopping, right?

There’s this insatiable appetite for more AI; deep neural networks are exploding everywhere. We’re going to see this constant need for processing in the cloud, which means we’re sending data up to the cloud.

That’s a privacy problem, right?

There are many ways to handle that. There’s a lot of good, interesting security hardware and security software popping up. However, I’m going to say that probably the most effective solution is pretty simple – don’t send as much data to the cloud. Do most of the processing at the endpoint, such as on the smartwatch, the smart thermostat, or the Echo device in your house.

There’s no need to send all the data up to the cloud. It can be processed locally. It turns out that’s a power problem, too. If I say that instead of sending 100% of the raw sound data from an Amazon Echo up to the cloud, we’ll only send the 1% that’s interesting enough to send, it means there needs to be local processing. It must be done on the smartwatch, thermostat, Echo, or devices with sensors capturing the data.
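
A toy sketch of that filtering idea is below; the energy threshold, frame format, and upload hook are hypothetical, and a real device would run a wake-word or event model in place of the simple energy check.

import numpy as np

ENERGY_THRESHOLD = 1e6   # hypothetical; tuned per microphone and gain in practice

def frame_energy(frame: np.ndarray) -> float:
    # Short-time energy as a crude stand-in for an on-device wake-word/event model.
    return float(np.mean(frame.astype(np.float64) ** 2))

def upload_to_cloud(frame: np.ndarray) -> None:
    print(f"uploading {frame.size} samples")   # placeholder for the network call

def process_stream(frames):
    for frame in frames:
        if frame_energy(frame) > ENERGY_THRESHOLD:
            upload_to_cloud(frame)   # only the small fraction judged interesting leaves the device
        # everything else is handled and discarded locally; raw audio never reaches the cloud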

That’s a power problem, especially as the neural networks running to support these use cases are getting bigger. Fortunately, SPOT is a great solution. We’re doing a lot of architectural innovation to ensure our customers can run big, beefy neural networks locally on devices like these. I’m confident that we can attack that problem. However, I foresee some rocky security problems in the next few years.

How do you see the role of AI and machine learning in the evolution of IoT devices?

What I think is going to happen is that we’ll see a migration of AI from purely a data center thing to the endpoints, which means wearables, smart home devices, your car, or devices with sensors.

We’re seeing our customers run lightweight activity tracking or biosignal analysis on wearables. They’re embedding virtual assistants in everything, whether it’s a wearable, hearable, or smart home appliance. In all these cases, there’s a balance between the endpoint and the cloud. The neural network can have a lightweight front end that runs locally; if it identifies something interesting, it then passes it to the cloud for further analysis.

What I’m really excited about is the potential for large language models (LLMs), such as ChatGPT, that are trained on enormous amounts of data, largely from the Internet. However, they don’t have eyes and ears; they’re not in the real world understanding what’s happening. That’s the role of the endpoint devices: your wearables, smart home devices, or smart switches. Those devices are constantly capturing information about us.

If they could run lightweight neural networks locally to identify activities or events of interest, they could ship those to the cloud to a ChatGPT-like model. Imagine if your wearable monitors your vitals, heart rate, breathing, and talking, and it identifies trends of interest and sends those up to the AI.

I’m not talking about megabytes or even gigabytes of the data it’s constantly collecting. I’m talking about sending a few little snippets – a few little observations. For example, you had a high-activity day today, or your sleep was not very good last night. You send that up to the cloud, and then you can ask it more useful questions like – Hey, I haven’t been feeling great lately, what’s wrong? The AI would be able to answer with something along the lines of: I’ve been watching you for the last six months, and I see that your sleep has been irregular. You need to get more regular sleep, and here’s what you can do to fix that, right?

There are countless examples of things like this where endpoint devices can collaborate with large language models in the cloud to achieve fantastic results.

Now, there are obvious security problems there. We just talked about how security problems are one of the major challenges facing IoT. That’s no different here, and it’s a problem that needs to be managed. However, if you do most of the processing locally, you can effectively manage the security issues. I think between the endpoint and the cloud, there’s a way to address the security problems that pop up. And I think there’s real power in what can be achieved for the end users.

How do you see ultra-low power technology evolving in the next 5-10 years?

The good news is that I see it improving with really no end in sight. We’re going to see far more compute power packed into shrinking power budgets.

Moore’s law is alive and well for the embedded world. We’re at a process node today that’s 22 nanometers. The likes of Qualcomm, Intel, and others are down below 5 nanometers, so we’ve got a long way to go to catch up to them.

Moore’s law is going to deliver all kinds of gains. That means faster processors and lower power processors. We’re also doing a ton of innovation on the architecture, circuits, and software. I don’t see an end to power improvement, certainly not in the next decade.

Just look at how far Ambiq has come in the last ten years. We have a family of SoCs called Apollo. The first one launched in 2014 and ran at 24 MHz, with a small processor and less than 1 MB of memory. Our latest Apollo4 processors have many megabytes of memory, run at nearly 200 MHz, and have GPUs and USB interfaces, all while consuming 1/8 of the power of our initial product. So we’re getting dramatically faster and dramatically lower power, and that will continue.

If you just extrapolate those numbers going forward, we’re going to have an amazing amount of compute power for all your IoT devices, and that’s exciting.

I don’t 100% know exactly what we’ll do with all that compute, but I do know that I’ve got customers asking me every day for more processing power, lower power, and they’re going to be doing some pretty exciting things.

So, I’m excited about where we’re going from here.

This interview originally appeared on SafetyDetectives with Shauli Zacks on January 4, 2024.

A 20/20 Look Into Computer Vision

Often without realizing it, most of us now engage regularly with sophisticated computer vision technology. Tasks that used to require a password or fingerprint for authentication now need little more than a glance at your smartphone.  

Forty percent of Americans use face biometrics and facial recognition technology with at least one app per day, and adoption increases to 75% among 18- to 34-year-olds1. The global computer vision market is expected to grow at a compound annual growth rate of 19.6% through 20302, and advances in deep learning techniques have widened the scope of what’s possible with today’s computer vision technology. 

From unlocking smartphones to walking through facial scanners at airports, computer vision technology is rapidly integrating into our everyday lives. 

What is Computer Vision? 

Computer vision is a subfield of computer science and artificial intelligence that utilizes computers and systems to gather meaningful information from images. Computer vision extrapolates data from images to make decisions. Its ultimate goal is correctly identifying objects and people to take appropriate action, such as avoiding a pedestrian on a walkway in a self-driving car or accurately identifying a smartphone user who can unlock their phone. 

How Does Computer Vision Work? 

Computer vision technology aims to mimic the human brain’s process of recognizing visual information. Utilizing pattern recognition, it absorbs inputs, labels them as objects, and finds patterns that produce familiar images. Computer vision works to derive meaning from images while cataloging visual data from the real world. 

The History of Computer Vision 

Like many fields of artificial intelligence, the first forays into computer vision occurred decades ago. In the 1960s, researchers used algorithms to process and analyze visual data. By the 1970s, the technology had become more accurate at image processing and pattern recognition. 

Over the next several decades, scientists used machine learning algorithms to power most computer vision technology, culminating in one of the largest breakthroughs of the time, the Viola-Jones face detection algorithm. This algorithm is still used today as a core machine-learning object detection framework. As technology rapidly progressed in the 2000s, convolutional neural networks enabled computers to detect objects and track movement with even greater accuracy. 
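
For readers who want to see the Viola-Jones approach in practice, OpenCV ships it as a pretrained Haar cascade. The minimal sketch below assumes OpenCV is installed and uses a placeholder image path.

import cv2

# OpenCV's bundled Haar cascade is an implementation of the Viola-Jones detector.
cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

image = cv2.imread("photo.jpg")                    # placeholder input image
gray = cv2.cvtColor(image, cv2.COLOR_BGR2GRAY)     # the detector works on grayscale

# Scan at multiple scales; each detection is an (x, y, w, h) bounding box.
faces = cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
for (x, y, w, h) in faces:
    cv2.rectangle(image, (x, y), (x + w, y + h), (0, 255, 0), 2)
cv2.imwrite("photo_faces.jpg", image)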

Real-Life Applications of Computer Vision 

Computer vision technology is revolutionizing many industries, from improved cancer detection to self-driving cars. Tools like facial recognition, object detection, and augmented reality offer multiple use cases for real-life applications. 

Autonomous Vehicles 

Tesla is a well-known example of self-driving vehicles, but Hyundai has also invested in a deep-learning computer vision startup to apply the technology to its autonomous vehicles3. Computer vision, a core functionality of autonomous vehicles, empowers self-driving cars to make sense of their surroundings; sensors and hardware gather billions of visual data points to create an image of what is happening outside the vehicle. From stop signs to hazardous objects on the road to pedestrians and other cars, computer vision algorithms improve the safety and efficiency of self-driving cars. 

Cancer Detection 

AI is rapidly evolving in the healthcare industry, and cancer detection is no exception. X-ZELL is a company that uses sophisticated computer vision technology to enable same-day cancer diagnosis from imagery4. Computer vision uses advanced algorithms and machine learning to analyze medical images like X-rays, MRIs, and CT scans to identify potential signs of cancer with higher accuracy. Because computer vision learns from massive data sets, it can accurately identify subtle patterns and features that might be difficult for humans to pick up on. In healthcare, this can improve patient outcomes, enhance treatments, and ultimately save lives. 

Security in Schools and Public Areas 

For increased security in schools and public areas, Visio.ai combines edge computing with on-device machine learning and sophisticated vision systems5. From high-traffic walkways in airports to intrusion detection in universities and schools, deep learning intelligence can be merged with common surveillance cameras, using facial recognition analysis to measure emotions and detect suspicious activity. Computer vision technology offers the opportunity to improve safety throughout public areas like schools, airports, transportation systems, and more. 

Manufacturing Settings 

Manufacturing settings are full of opportunities for computer vision technology, from quality inspections to production monitoring to supply chain logistics. For example, in quality inspections, computer vision can automatically detect defects, scratches, and other anomalies. With radio frequency identification (RFID), computer vision technology can track products across supply lines, optimizing inventory, production schedules, and delivery. From improved supply chain logistics to ensuring consistent quality for semiconductors, computer vision supports better lighting, better product consistency, increased efficiencies, and more. 

Challenges of Computer Vision 

Computer vision and endpoint intelligence offer seemingly limitless opportunities for advancement in critical sectors. While safer vehicles and faster cancer diagnosis don’t seem problematic, the intelligence behind computer vision has challenges. 

Privacy Concerns 

Privacy and security are top concerns, like with many artificial intelligence tools. The risk for data breaches is high, and with sensitive, confidential information stored in potentially unsecured platforms, cybercriminals are highly incentivized to attack. Consumers worry about giving too much personal data to technology companies, and with cybercrime on the rise, computer vision AI tools need to ensure they’ve shored up their defenses. 

High Costs 

Currently, computer vision technology is not cheap to implement, and especially in more sophisticated use cases, the cost of purchasing hardware and software and performing maintenance is high. Add in large data sets that need to be cleaned, stored, and maintained, and computer vision becomes even more costly. Maintenance of these systems is also expensive, and predictive maintenance is necessary to fix potential equipment defects before they become bigger issues. 

Lack of Trained Experts 

While computer vision is rapidly evolving, few companies or individuals have vast expertise. As with any newer technology, it will take time for education and training to catch up with real-life applications adequately. Companies struggle to maintain specialized tech talent, and computer vision is no exception. Organizations also need trained experts on the differences between artificial intelligence, machine learning, and deep learning to train systems adequately. 

The Future of Computer Vision 

Computer vision technology is still in its infancy, but society has already seen its vast impact across manufacturing, education, security, retail, healthcare, the automotive industry, and more. There is so much opportunity for consumer computer vision technology as demand for Internet of Things (IoT) devices accelerates: virtual reality headsets, augmented reality smart glasses, and more. As hardware becomes more sophisticated yet affordable, computer vision wearables and smart gadgets can trickle down to the average person. Also, as generative AI and deep learning accelerate, computer vision models will have more inputs from which to learn. 

How Ambiq Contributes 

Computer vision technology requires an embedded chip capable of processing machine learning inferencing. For this technology to be practical on endpoint devices, it needs to run at low power and maximum efficiency. Ambiq’s ultra-low power system-on-chips (SoCs) enable endpoint devices to deliver optimal performance and energy efficiency while running inference locally on the device.  

Our friends at Northern Mechatronics (NMI) recently demonstrated digit recognition on their flagship NM180100, enabled by Ambiq’s Apollo3 SoC. Numbers were identified and returned in less than two seconds. See it for yourself: 

Sources: 

1 New CyberLink Report Finds Over 131 Million Americans Use Facial Recognition Daily and Nearly Half of Them to Access Three Applications or More Each Day | November 22, 2022 
2 Computer Vision Market Size, Share & Trends Analysis Report By Component, By Product Type, By Application, By Vertical (Automotive, Healthcare, Retail), By Region, And Segment Forecasts, 2023 – 2030 | 2021 
3 Hyundai Invests in Deep Learning Computer Vision Startup allegro.ai | May 11, 2018 
4 X-Zell | 2023 
5 Top 9 Applications of AI Vision in the Education Sector | 2023 
