An edge device is a piece of hardware that controls data flow at the boundary between two networks. Edge devices serve different roles depending on their type, but essentially act as network entry points. Typical functions include the transmission, routing, processing, monitoring, filtering, translation and storage of data passing between networks. Both enterprises and service providers use edge devices.
The edge AI hardware market is driven by growing demand for faster, more efficient edge devices that reduce processing time in AI applications; falling costs and product innovation are also expected to fuel market growth. However, a lack of skills and training for operating these devices, along with the difficulty of integrating equipment across varied AI devices, is weighing on growth in the current market.
Power-efficient chipsets are the main driver of edge AI. Mobile vendor Huawei has already introduced on-device AI training for battery power management in its P20 Pro handset, in partnership with Cambricon Technologies. Chip vendors NVIDIA, Intel, and Qualcomm are also pushing to deliver the hardware that will let automotive OEMs experiment with on-device AI training to support their autonomous-driving efforts. On-device training at the edge is beginning to gain R&D momentum, but it could still take some time to become a realistic approach in most segments.
“The massive growth in devices using AI is positive for all players in the ecosystem concerned, but critically those players enabling AI at the edge are going to see an increase in demand that the industry to date has overlooked. Vendors can no longer go on ignoring the potential of AI at the edge. As the market momentum continues to swing toward ultra-low latency and more robust analytics, end users must start to incorporate edge AI in their roadmap. They need to start thinking about new business models like end-to-end integration or chipset as a service,” Vernon concludes.
Artificial Intelligence (AI) will see a significant shift out of the cloud and on to the edge (i.e. on-device, gateway, and on-premise server), initially for inference (machine learning) and subsequently for training. This shift represents a huge opportunity for chipset vendors with power-efficient chipsets and other products that can meet the demand for edge AI computing. Edge AI inference will grow from just 6% of the market in 2017 to 43% in 2023.
In addition, the report profiles leading industry players, covering company profiles, products and services offered, financial information for the last three years, and key developments over the past five years.
Cloud providers will still play a pivotal role, particularly when it comes to AI training. Of the 3 billion AI device shipments expected in 2023, over 2.2 billion will rely on cloud service providers for AI training. This still represents a real-terms decline in cloud providers' share of AI training, which currently stands at around 99% but will fall to 76% by 2023. Hardware providers should not be too concerned about this shift away from the cloud, as AI training is likely to be supported by the same hardware, only at the edge, on either on-premise servers or gateway systems.
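As a quick sanity check on the figures above, a minimal sketch (using the article's own projections, not independent data) shows that the 76% cloud share and the "over 2.2 billion" device count are consistent with each other:

```python
# Projections cited in the article (assumptions, not measurements)
total_ai_shipments_2023 = 3.0e9          # projected AI device shipments in 2023
cloud_share_2023 = 0.76                  # projected cloud share of AI training

cloud_trained_2023 = cloud_share_2023 * total_ai_shipments_2023
print(f"Cloud-trained devices in 2023: {cloud_trained_2023 / 1e9:.2f} billion")
# 0.76 * 3 billion = 2.28 billion, i.e. "over 2.2 billion" as stated
```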
The edge AI hardware market is segmented by processor, device, power consumption, process, and end-user industry. By processor type, the market is segmented into central processing units (CPUs), graphics processing units (GPUs) and application-specific integrated circuits (ASICs). By device, it is segmented into smartphones, cameras, robots, wearables, smart speakers, automotive and smart mirrors. By power consumption, it is segmented into less than 1 W, 1-3 W, 3-5 W, 5-10 W and more than 10 W.