Ask the Expert: Joining Ambiq to Enable Endpoint AI

April 6, 2022 by Carlos Morales

Ambiq’s Ask the Expert blog series offers industry insight and technology expertise from the minds of Ambiq’s brightest professionals. 

In this article, Ambiq’s Vice President of Artificial Intelligence, Carlos Morales, shares his knowledge, commitment to excellence, and love for AI as he answers the question: Why did you join Ambiq?


I landed my first job through a happy misunderstanding. My resume listed ‘microcode’ experience, which most companies would recognize as an esoteric kind of coding I had used for a college project. Not IBM, though: ‘microcode’ was their word for ‘firmware.’ After a very confusing interview, I moved to Silicon Valley and joined IBM.

Nine companies, four startups, and countless projects later, I have become much more intentional in my searches.

When I first heard of Ambiq®, I nearly dismissed their claims of leveraging sub-threshold silicon design approaches out of hand: many other companies had tried to make sub-threshold work and failed, sometimes spectacularly. It was ‘common knowledge’ that sub-threshold wasn’t practical: sure, you could build a test chip, and maybe piece together a demo for CES, but the complexity, design, and manufacturing challenges made it impractical. Imagine my surprise and excitement when I realized Ambiq hadn’t just solved the sub-threshold puzzle but had taken that solution to market to the tune of millions of devices per month.

I was floored.

My next concern was whether it was realistic to deploy AI to the endpoints that Ambiq’s devices typically target – battery-powered devices with compute and memory resources multiple orders of magnitude smaller than AI usually deals with. In AI, and particularly in the Deep Learning world, memory starts at 32GB, and compute resources start at hundreds of trillions of operations per second and skyrocket from there, often occupying acres of datacenter space.

Here, again, I was pleasantly surprised, discovering several strong trends all converging to enable sophisticated AI on these tiny flecks of silicon:

  • First, a cadre of AI researchers had turned their sights on making AI smaller, leading to significant algorithmic advances in quantization, pruning, and distillation techniques.
  • Second, the software stack had matured to the point where a vibrant ecosystem of AI developers came into being, generating innovative and interesting AI applications.
  • Third, these two trends led to the creation of tools that democratized AI, leveraging techniques such as AutoML and low/no-code development to bring AI to everyone with data and an idea.
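To make the first of those trends concrete, here is a minimal sketch of symmetric 8-bit post-training quantization, one of the model-shrinking techniques mentioned above. The function names and the per-tensor scaling scheme are illustrative assumptions, not taken from any particular framework:

```python
def quantize_int8(weights):
    """Map float weights to int8 using a single per-tensor scale factor."""
    max_abs = max(abs(w) for w in weights)
    scale = max_abs / 127.0 if max_abs else 1.0
    # Round each weight to the nearest representable int8 step.
    q = [max(-128, min(127, round(w / scale))) for w in weights]
    return q, scale

def dequantize(q, scale):
    """Recover approximate float weights from the int8 values."""
    return [v * scale for v in q]

weights = [0.12, -0.5, 0.33, -0.07]
q, scale = quantize_int8(weights)
approx = dequantize(q, scale)
# Each weight now occupies 1 byte instead of 4, at the cost of a
# rounding error no larger than one quantization step.
```

Shrinking weights fourfold like this (and further, with pruning and distillation) is part of what makes neural networks small enough to fit in an endpoint device’s memory.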

All that was missing was a way to bring these innovations to the mass market in a practical way. For example, consider smartwatches: many of their features rely heavily on AI (it’s how they know if you’re running or how they listen for your commands), but the intense computational requirements meant that they either ran out of battery in a day or had to rely on a datacenter to perform the computation, spending battery to transmit data back and forth, slowing the interaction down, and exposing your personal data in the process. Ideally, all that AI should run right on your wrist.

And that is where Ambiq’s electricity-sipping superpower came in:

Ultra-efficient compute is the final trend needed to enable sophisticated AI everywhere


SPOT® means I can compute something ten times (or more) using the same power it usually takes to compute it once. It takes a while to mentally adjust to being an order of magnitude more capable than you are used to – it’s like jumping and suddenly finding yourself above the rooftops. The second jump, though, is a blast. Once you make that mental adjustment and start realizing what you can do with that extra power, that is where things get exciting.

Technology is a difficult, challenging field, especially when you choose to be right at the bleeding edge as I have throughout my career. When everything is ‘version zero,’ and you are creating everything from scratch, wins are rare and hard-fought. It is always satisfying, but not always fun.

And that is something else Ambiq has reminded me of: Delighting your customers is fun. Bragging about your AI capabilities is fun. Winning is fun.

Over the next few months, I’ll be writing about some of the topics below:

  • Constrained AI Research Trends and Techniques
  • Power-Saving Tips and Tradeoffs
  • New AI-Driven Features That Will Soon Pop Up on Endpoints Everywhere

Stay tuned for more articles on this topic.

Written by Carlos Morales

Carlos Morales, Vice President of Artificial Intelligence at Ambiq, has over 30 years of research and development experience spanning silicon to cloud. In addition to AI, his past roles include cloud-based back-end applications; cybersecurity; workload scheduling, orchestration, and isolation; and efficient networking.
