SoftKinetic becomes Sony Depthsensing Solutions

Belgian company enters new development phase within global electronics giant

BRUSSELS, Dec. 19, 2017 /PRNewswire/ -- Two years after its acquisition, SoftKinetic becomes Sony Depthsensing Solutions.

"This transition is the culmination of our work as a Sony subsidiary over the past couple of years," states CEO Akihiro Hasegawa. "We are honored to become an integral part of the world's leading image sensing company, and we will continue working towards the integration of our DepthSense technology into products for the mobile, robotics, and automotive industries worldwide."

SoftKinetic has long been a leader in 3D vision and gesture recognition technologies: it has continuously delivered state-of-the-art solutions in the field of 3D sensing and processing through the development of CMOS 3D sensors, 3D camera reference designs, SDKs, algorithms, and applications for gesture recognition, object scanning, automotive control, and AR/VR. All of these will now be developed under Sony Depthsensing Solutions.

"We have great expectations for depth sensing technology," explains Sony Semiconductor Solutions Corporation Senior General Manager, Satoshi Yoshihara, "as we continue expanding the realm of senses for machines by enabling them with human-like sight."

Compelling achievements in this area include the integration of DepthSense technology and gesture recognition software into premium vehicles, as well as the recent integration of a DepthSense camera module and software designed by the Brussels-based company into Sony's new Entertainment Robot "aibo". "Sooner rather than later, machines, including cars, will be able to detect, recognize, and react toward human beings in ways we used to think possible only in fiction," adds Akihiro Hasegawa. The Brussels-based team is committed to bringing not only consumer electronics products, but also robots and automobiles to the next level of evolution.

"We trust that 3D depth-sensing technology will revolutionize the way humans and machines interact in the near future, and this conviction is only reinforced by the confidence Sony has put in Sony Depthsensing Solutions to carry out this (r)evolution," concludes Hasegawa.

Press Contact

Laetitia Fernandez, Senior Marketing Manager
Laetitia.Fernandez@sony.com
+32.474.92.07.63

About Sony Depthsensing Solutions

Sony Depthsensing Solutions is a leader in 3D vision and gesture recognition technologies. The company's DepthSense 3D CMOS Time-of-Flight sensors and cameras, together with advanced middleware, provide cutting-edge 3D vision capabilities for a wide variety of industries including gaming, AR/VR, PC, mobile, and automotive. For more information on Sony Depthsensing Solutions, please visit http://www.sony-depthsensing.com.

Logo: http://mma.prnewswire.com/media/620226/Sony_Logo.jpg

© 2017 PR Newswire