Attitudes Shift to Internet of Things and Smart Homes

The artificial intelligence of things (AIoT), a technological ecosystem, emerged during the pandemic. The smart home developed alongside it.

AIoT combines connected things (IoT) with artificial intelligence (AI) running on those things.

The last 12 months have been challenging. The pandemic wreaked havoc around the world, and people now realize that COVID-19 is here to stay.

We now accept this fact and look for ways to adapt our lives and interactions with the world. To ensure that people live safe, productive, and happy lives, governments, industries, and businesses are constantly changing the status quo.

People have had to change how and where they work. Over the past year, working from home has become the norm, and businesses can continue to allow employees to work remotely as long as they remain productive. Working from home has placed renewed emphasis on the importance of work and the value of our homes. Tech-enabled smart home discussions are now timelier than ever.

Smart homes and all the technology involved are still a very young industry. Last year, research set out to determine the obstacles preventing AIoT from becoming a reality, and electronics engineers identified significant market-level and device-level issues. The researchers then repeated the study a year later to see how things had improved.

AI raises security concerns because of its reliance on data: the more information a device has, the smarter it is. Engineers have discovered that local data processing can solve privacy problems. Households can keep their data within their own four walls instead of sharing it with third parties in the cloud, and simply cutting out those third parties reduces the risk of data leakage.
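Here is a minimal sketch of that idea in C, assuming a hypothetical voice-enabled device; capture_audio_frame, detect_wake_word, and send_event are placeholder names for illustration, not a real SDK. The point is that raw data is analyzed and discarded locally, and only a small derived event ever leaves the home.

```c
#include <stdbool.h>
#include <stddef.h>
#include <stdint.h>

/* Placeholder functions, assumed for illustration rather than taken from a real SDK. */
size_t capture_audio_frame(int16_t *buf, size_t max_samples); /* raw microphone samples */
bool   detect_wake_word(const int16_t *buf, size_t n);        /* on-device AI model */
void   send_event(const char *event);                         /* small, non-identifying message */

void privacy_preserving_listener(void)
{
    int16_t frame[512];

    for (;;) {
        size_t n = capture_audio_frame(frame, 512);

        /* The raw audio is analyzed locally and then discarded; it is never
           uploaded, so no third party in the cloud ever holds it. */
        if (detect_wake_word(frame, n))
            send_event("wake_word_detected"); /* only the derived result leaves the device */
    }
}
```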

Smart home

Storing data inside the smart home means a distant cybercriminal would have to become a common burglar to steal it. Although that is unlikely to happen, device manufacturers must still ensure that data processing on their devices is secure.

You can have significantly greater security around your data and decision-making by using various device-level security features, such as secure key storage, accelerated encryption, and true random number generation.
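As a rough sketch of how those features fit together, consider the C fragment below. The function names (trng_read_bytes, secure_store_key, aes_hw_encrypt) are hypothetical stand-ins for whatever a particular chip's security SDK provides, since every vendor exposes its own API.

```c
#include <stddef.h>
#include <stdint.h>
#include <string.h>

/* Hypothetical hardware-abstraction calls; treat these names as placeholders
   for a specific chip vendor's security SDK. */
int trng_read_bytes(uint8_t *buf, size_t len);                                  /* true random number generator */
int secure_store_key(int slot, const uint8_t *key, size_t len);                 /* write-only secure key storage */
int aes_hw_encrypt(int key_slot, const uint8_t *in, uint8_t *out, size_t len);  /* accelerated encryption */

/* Encrypt a sensor reading before it ever leaves RAM, with a key that
   never leaves the secure element. */
int protect_reading(const uint8_t *reading, uint8_t *ciphertext, size_t len)
{
    uint8_t key[16];

    if (trng_read_bytes(key, sizeof key) != 0)          /* generate a fresh device key */
        return -1;
    if (secure_store_key(0, key, sizeof key) != 0)      /* lock it into secure key storage */
        return -1;
    memset(key, 0, sizeof key);                         /* wipe the plaintext copy */

    return aes_hw_encrypt(0, reading, ciphertext, len); /* hardware does the heavy lifting */
}
```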

Last year, engineers felt that connectivity was a major barrier to AI deployment. Now, only 27% of industry professionals see connectivity as a major technology hurdle, while 38% expressed concern about the technology’s ability to overcome latency issues. Home healthcare monitoring, for example, cannot afford to be hampered by poor connectivity when it has to make decisions about life-threatening events like heart attacks. Using on-device processing, however, makes network latency irrelevant.

If the industry wants latency-free applications, it should switch to on-device computing. Some AIoT chips can now make decisions in nanoseconds, allowing products to think quickly and act accurately.
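A minimal sketch of what that looks like in practice, using the heart-monitoring example above: every function name and threshold here is an illustrative assumption rather than a real medical API, and a real product would rely on a clinically validated model.

```c
#include <stdint.h>

/* Placeholder functions for illustration only; real devices would call the
   chip vendor's SDK and a validated detection model. */
uint16_t read_heart_rate_sensor(void);          /* latest beats-per-minute sample */
float    run_local_anomaly_model(uint16_t bpm); /* on-device inference, no network round-trip */
void     trigger_local_alarm(void);             /* alert the household immediately */
void     queue_caregiver_notification(void);    /* cloud sync can happen whenever connectivity allows */

void monitor_loop(void)
{
    for (;;) {
        uint16_t bpm   = read_heart_rate_sensor();
        float    score = run_local_anomaly_model(bpm); /* the decision is made on the device */

        if (score > 0.9f) {                 /* illustrative threshold */
            trigger_local_alarm();          /* no cloud round-trip, so network latency never matters */
            queue_caregiver_notification(); /* connectivity is useful but not on the critical path */
        }
    }
}
```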

AIoT

Engineers also highlighted scaling as an issue last year: the number of connected devices keeps increasing, putting ever more pressure on cloud infrastructure. In 2020, about 25% of engineers believed that scalability was a barrier to the success of edge technology. Now, however, experts are beginning to recognize the inherent scalability advantages of processing at the edge.

Processing at the edge takes the cloud out of the equation, negating any potential scaling and growth issues. Today, fewer than a fifth of engineers believe that cloud infrastructure can hold back edge AI.

The good news? The electronics industry does not have to do anything special to ensure the scalability of AIoT. One of the main technical obstacles to its expansion, the need for cloud processing to handle billions of devices and ever-growing petabytes of data, has now been eliminated.

Increased processing capability, decreased power consumption

The AIoT market has grown over the past year, and it has advanced on a technical level as well. On-device AI processing capabilities have improved while the power and cost required have come down. Chipmakers can now tailor chips to various AIoT needs at an affordable price point.

So how can engineers make AIoT chips a realistic option for product manufacturers?

The development environment is a crucial consideration. New chip architectures often mean immature and untested proprietary programming platforms that engineers must learn and become familiar with.

Instead, engineers should look for chips that let them use industry-standard tools they already know, including full programming and runtime environments such as FreeRTOS, TensorFlow Lite, and C. With such familiar platforms, engineers can program chips quickly without learning new languages, tools, or techniques.
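As a minimal sketch, assuming a FreeRTOS port exists for the target chip: the run_local_inference() call below is a hypothetical placeholder for whatever model runtime the vendor supports (for example, a TensorFlow Lite for Microcontrollers invocation); only the FreeRTOS task calls are standard API.

```c
#include "FreeRTOS.h"
#include "task.h"

/* Hypothetical placeholder for the model runtime call; the real signature
   depends on how the chip vendor integrates TensorFlow Lite or a similar runtime. */
extern float run_local_inference(void);

/* An ordinary FreeRTOS task: read, infer, act, then sleep until the next cycle. */
static void vInferenceTask(void *pvParameters)
{
    (void)pvParameters;

    for (;;) {
        float score = run_local_inference();  /* all compute stays on the device */

        if (score > 0.5f) {
            /* React locally: toggle a GPIO, raise an alert, and so on. */
        }

        vTaskDelay(pdMS_TO_TICKS(1000));      /* run once per second */
    }
}

int main(void)
{
    /* One familiar C and FreeRTOS environment covers both the application
       logic and the AI workload, so there is no new proprietary toolchain to learn. */
    xTaskCreate(vInferenceTask, "infer", 1024, NULL, tskIDLE_PRIORITY + 1, NULL);
    vTaskStartScheduler();

    for (;;) { }  /* the scheduler should never return */
}
```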

Having a single programming environment that can handle all the computing requirements of an AIoT system is critical. That breadth of compute coverage will always be key to enabling the design speed needed to bring fast and secure AI into the home in the new post-COVID era.

Image credit: Kindel Media; Pexels; Thanks!

Deanna Richie

Editor-in-Chief at ReadWrite

Deanna is the managing editor of ReadWrite. Previously, she worked as the editor-in-chief of Startup Grind and has over 20 years of content development and management experience.
