Why is Edge AI Important?


This article is part of Demystifying AI, a series of posts that (try to) disambiguate the jargon and myths surrounding AI.

Imagine having to run to your local library and flip through an encyclopedia every time you saw a dog or cat in the street and wanted to know what species it is. That is pretty much how artificial intelligence applications function today.

Artificial intelligence can predict stock movements, diagnose patients, screen job applicants, play chess and Go, and perform many other tasks on par with or better than humans. Humans still have one advantage, however: they have intelligence at the edge.

Most of what humans do is processed and performed by the brain, a source of computational power that sits in direct proximity to their limbs and senses. When their own processing power and memory aren’t enough to solve a problem, they can tap into knowledge stored farther away, whether by visiting the library or by sitting behind a computer and googling an unknown term. In the case of the cat-and-dog example, we eventually learn the names of the species and rely on our own memory when we see a new animal.

In contrast, most mobile apps, Internet of Things devices, and other applications that use AI and machine learning algorithms must rely on processing power sitting in the cloud or in a data center thousands of miles away, and have little intelligence to apply at the edge. Even if you show your favorite yogurt to your smart fridge a thousand times, it still has to query its cloud server to recognize it the 1,001st time.

They do this very efficiently, faster than any human could possibly run through a catalog of products. But they still don’t have a fraction of the processing power that humans have at the edge.
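To make the contrast concrete, here is a minimal sketch of the two recognition paths. The endpoint URL and the local model’s predict() interface are hypothetical placeholders, not any real product’s API:

```python
# Minimal sketch of cloud vs. edge recognition paths.
# CLOUD_ENDPOINT and local_model.predict() are hypothetical placeholders.
import requests  # pip install requests

CLOUD_ENDPOINT = "https://api.example.com/v1/classify"  # hypothetical URL

def classify_in_cloud(image_bytes: bytes) -> str:
    """Every request pays a full network round trip to a distant server."""
    response = requests.post(CLOUD_ENDPOINT, files={"image": image_bytes}, timeout=5)
    return response.json()["label"]

def classify_at_edge(image_bytes: bytes, local_model) -> str:
    """The model runs on-device; no network dependency at decision time."""
    return local_model.predict(image_bytes)  # hypothetical model interface
```

The cloud path works, and works fast, but the network round trip is baked into every single decision; the edge path pays it never.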

The problem with machine learning algorithms is that they are both compute- and data-intensive, which limits the environments where artificial intelligence can be deployed. As computation gradually moves from information technology into operational technology, we need the development and deployment of fog computing technologies that enable AI functions to run at the edge.

The limits of cloud-based AI


Thanks to broadband internet connectivity, web APIs for AI engines have sub-second response times. This is excellent for predictive and prescriptive analytics such as product suggestions, sales forecasts, disease diagnosis, job application processing, music composition, and much more.

But it’s not enough for real-time operations where milliseconds matter.

For instance, a robot surgeon performing a delicate operation may need to analyze images and make decisions several or even dozens of times per second. The round trip to the cloud would introduce lags that could have dire and irreversible consequences.
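A back-of-the-envelope calculation shows why. The timings below are illustrative assumptions, not measurements:

```python
# Latency budget for a control loop making 30 decisions per second.
# All timings are illustrative assumptions.
DECISIONS_PER_SECOND = 30
budget_ms = 1000 / DECISIONS_PER_SECOND  # ~33.3 ms per decision

cloud_round_trip_ms = 100  # assumed WAN round trip to a distant data center
edge_inference_ms = 10     # assumed on-device inference time

print(f"budget per decision: {budget_ms:.1f} ms")
print(f"cloud path fits the budget: {cloud_round_trip_ms <= budget_ms}")  # False
print(f"edge path fits the budget:  {edge_inference_ms <= budget_ms}")    # True
```

Even a generously fast round trip blows through a 33-millisecond budget before any computation has happened.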

The same rule applies to driverless cars, robots, and drones that have to move around obstacles and navigate complex terrain and situations. There’s already a push for edge algorithms on police surveillance cameras, which need to process and analyze a fast stream of imagery in real time.

These applications also rely on sending large amounts of information to the cloud, which creates a new set of problems. One concerns the sensitivity of the information: sending and storing so much data in the cloud entails security and privacy challenges. Application developers will have to consider whether the deluge of information they send to the cloud contains personally identifiable information (PII) and whether storing it breaches privacy laws. They’ll also have to take the necessary measures to secure the stored information and prevent it from being stolen, or accessed and shared illegally.

Cloud-based AI is also limited in environments with little or no connectivity, whether because of a lack of communications infrastructure or because of the sensitivity of the operations and information involved. The only alternative to cloud servers is a proprietary data center, which is costly to set up and maintain.

Remote locations such as countryside farms, which could benefit immensely from artificial intelligence, will have limited access to AI applications because of their poor connectivity. As IoT moves into more remote and disconnected environments, the need for edge or fog computing will only grow.

What are the trends?


There are several ways AI can be pushed to the edge, expanding the domains where it can be applied.

  • Distributed computing: A lot of the computing power across the world goes to waste as devices sit idle. While the processing power of any single device might not be enough to run data-intensive AI algorithms, their combined resources can tackle most tasks. Blockchain, the technology that underlies cryptocurrencies, offers an interesting way to create decentralized computers from numerous devices and is especially suited to IoT environments.
  • AI co-processors: Just as GPUs drove innovations in digital imagery such as gaming and rendering, dedicated AI co-processors can drive similar advances in the AI industry. Until now, GPUs have filled this role because of their immense power at parallel operations, a trend that has pushed companies like Nvidia, once geared exclusively toward graphics processing, to make inroads into AI. We’re now seeing the emergence of external AI processors such as the Movidius Neural Compute Stick, which provides deep learning computing power at the edge (see the sketch after this list).
  • Advanced algorithms: Scientists and researchers are working on algorithms that more closely mimic how the human brain functions and require less data to understand concepts and make decisions. This can help lower the barriers to bringing AI closer to the edge.
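As one concrete illustration of the last two trends, here is a hedged sketch of running a compact, quantized model entirely on-device with TensorFlow Lite. TFLite is our illustrative choice rather than anything named above, and "model.tflite" is a hypothetical model file:

```python
# Hedged sketch: on-device inference with TensorFlow Lite.
# "model.tflite" is a hypothetical model file.
import numpy as np
from tflite_runtime.interpreter import Interpreter  # pip install tflite-runtime

interpreter = Interpreter(model_path="model.tflite")
interpreter.allocate_tensors()

input_details = interpreter.get_input_details()
output_details = interpreter.get_output_details()

# Feed a dummy input matching the model's expected shape and dtype.
dummy = np.zeros(input_details[0]["shape"], dtype=input_details[0]["dtype"])
interpreter.set_tensor(input_details[0]["index"], dummy)
interpreter.invoke()  # inference runs locally, no network round trip
scores = interpreter.get_tensor(output_details[0]["index"])
print("predicted class:", int(np.argmax(scores)))
```

The same interpreter can be delegated to an accelerator stick or an NPU where one is available; the point is that recognition no longer depends on a connection.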

The development and combination of these trends can hopefully make it possible to execute AI algorithms closer to where operations are taking place. Edge computing will not be a replacement for the power of the cloud. It can, however, make AI’s operating model resemble that of humans: perform routine and time-critical decisions at the edge, and refer to the cloud where more intensive computation and historical analysis are needed.
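A minimal sketch of that hybrid operating model, with a hypothetical edge_model and cloud_queue standing in for real components:

```python
# Sketch of the hybrid edge/cloud pattern: decide locally in real time,
# defer hard or ambiguous cases to the cloud. All objects are hypothetical.
CONFIDENCE_THRESHOLD = 0.6  # assumed cutoff for "sure enough to act locally"

def handle_frame(frame, edge_model, cloud_queue):
    label, confidence = edge_model.predict(frame)  # fast, on-device decision
    if confidence < CONFIDENCE_THRESHOLD:
        cloud_queue.put(frame)  # heavier analysis happens asynchronously
    return label
```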

Originally posted at Tech Talks Blog
