Tuesday, 2 July 2019

Samsung On-Device AI technology for AI Deep Learning announced

Earlier last month, Samsung expanded development of its proprietary NPU technology, and at the Computer Vision and Pattern Recognition (CVPR) conference it introduced its On-Device AI lightweight algorithm. Today, Samsung announced an update: the company has successfully developed On-Device AI lightweight technology that performs computations 8 times faster than existing 32-bit deep learning data for servers. A core feature of On-Device AI technology is its ability to compute large amounts of data at high speed without consuming excessive amounts of electricity.

How does On-Device AI lightweight technology work? Through learning, the technology determines the intervals of the significant data that influence overall deep learning performance. Samsung Advanced Institute of Technology (SAIT) is said to have run experiments demonstrating that quantizing an in-server deep learning algorithm, originally in 32-bit intervals, down to levels of less than 4 bits provided higher accuracy than other existing solutions. When the data of a deep learning computation is presented in bit groups lower than 4, computation results using the QIL (Quantization Interval Learning) process can achieve the same results as existing processes while using only 1/40 to 1/120 as many transistors ...
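To illustrate the general idea behind this kind of quantization, here is a minimal Python sketch that clips 32-bit floating-point weights to a chosen interval and maps them onto 16 discrete 4-bit levels. This is not Samsung's actual QIL implementation (QIL learns the interval bounds during training); the function name, interval bounds, and sample values below are illustrative assumptions.

```python
import numpy as np

def quantize_to_4bit(weights, lower, upper):
    """Illustrative uniform quantizer: clip float32 weights to [lower, upper]
    and map them onto 16 discrete levels (4 bits). The interval plays the
    role of the 'significant data interval' described above; QIL itself
    learns these bounds rather than taking them as fixed inputs."""
    levels = 2 ** 4  # 4-bit quantization -> 16 representable values
    step = (upper - lower) / (levels - 1)
    clipped = np.clip(weights, lower, upper)
    # Round each clipped weight to its nearest integer code in [0, 15] ...
    codes = np.round((clipped - lower) / step).astype(np.uint8)
    # ... and map the codes back to approximate float values for inference.
    dequantized = lower + codes * step
    return codes, dequantized

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    w = rng.normal(0.0, 0.5, size=8).astype(np.float32)  # stand-in 32-bit weights
    codes, approx = quantize_to_4bit(w, lower=-1.0, upper=1.0)
    print("original   :", np.round(w, 3))
    print("4-bit codes:", codes)
    print("dequantized:", np.round(approx, 3))
```

Because each weight is stored as a 4-bit code instead of a 32-bit float, the memory footprint and the width of the arithmetic units needed to process it shrink dramatically, which is where the claimed transistor and power savings come from.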