Apple’s Next-Level AR Begins with the LiDAR Sensor

LiDAR sensor

At the unveiling of the iPad Pro 2020, Apple brought out a sleek-looking iPad. The company introduced trackpad support and the A12Z Bionic chip, but the most prominent addition was the Apple LiDAR sensor.

Apple added a LiDAR sensor to the iPad Pro 2020, an evolution of the depth-sensing TrueDepth front-facing imaging technology introduced with the iPhone X in 2017. In 2020, Apple plans to bring the depth-sensing ‘Apple LiDAR sensor’ to its upcoming iPhone and Apple Glass.

BUT

What Is a LiDAR Sensor?

LiDAR technology emits light as laser pulses toward a target; the LiDAR sensor then measures the reflected light to work out the distance between the sensor and the target. The concept may sound new, but the truth is, it is everywhere.
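
Under the hood this is simple time-of-flight arithmetic: distance is the speed of light times the round-trip time of the pulse, divided by two. A minimal Swift sketch (illustrative names only, not any real sensor API):

```swift
import Foundation

// Illustrative only: a LiDAR sensor derives distance from the round-trip
// time of a reflected laser pulse (time of flight). Names here are hypothetical.
let speedOfLight = 299_792_458.0            // metres per second

/// Distance to the target, given the round-trip time of a pulse in seconds.
func distance(forRoundTripTime t: Double) -> Double {
    // The pulse travels to the target and back, so halve the path length.
    return speedOfLight * t / 2.0
}

// A reflection arriving ~33 nanoseconds after emission corresponds to ~5 m,
// roughly the range Apple quotes for the iPad Pro scanner.
print(distance(forRoundTripTime: 33e-9))    // ≈ 4.95 metres
```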

Some instances include self-driving cars, large-scale industrial automation, robots, drones, surveying equipment, and security.

Car LiDAR scanner

LiDAR sensors have broadened what is possible and help us understand the properties of the world around us.

Apple’s decision to position the LiDAR sensor as an AR tool shows the company’s high hopes for LiDAR technology in augmented reality.

Apple’s iPad Pro 2020 LiDAR sensor doesn’t match professional LiDAR scanners like those used for surveying and outdoor scanning. It can measure surroundings up to 5 meters away, works both indoors and outdoors, and performs at the photon level at nanosecond speeds, says Apple:

The breakthrough LiDAR Scanner enables capabilities never before possible on any mobile device. The LiDAR Scanner measures the distance to surrounding objects up to 5 meters away, works both indoors and outdoors, and operates at the photon level at nano-second speeds. New depth frameworks in iPadOS combine depth points measured by the LiDAR Scanner, data from both cameras and motion sensors, and is enhanced by computer vision algorithms on the A12Z Bionic for a more detailed understanding of a scene. The tight integration of these elements enables a whole new class of AR experiences on iPad Pro.

Apple.com
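
Apple’s internal depth pipeline isn’t public, but on LiDAR-equipped devices ARKit exposes the fused depth map through the sceneDepth frame semantics (ARKit 4, iPadOS 14 and later). A minimal sketch of reading it:

```swift
import ARKit

// A minimal sketch (not Apple's internal pipeline): reading the LiDAR depth map
// that ARKit exposes through the sceneDepth frame semantics.
final class DepthReader: NSObject, ARSessionDelegate {
    let session = ARSession()

    func start() {
        let config = ARWorldTrackingConfiguration()
        // Only LiDAR-equipped devices support per-frame scene depth.
        guard ARWorldTrackingConfiguration.supportsFrameSemantics(.sceneDepth) else { return }
        config.frameSemantics.insert(.sceneDepth)
        session.delegate = self
        session.run(config)
    }

    func session(_ session: ARSession, didUpdate frame: ARFrame) {
        // depthMap is a CVPixelBuffer of per-pixel distances in metres,
        // already fused with camera and motion data by the framework.
        guard let depth = frame.sceneDepth?.depthMap else { return }
        print("Depth map:", CVPixelBufferGetWidth(depth), "x", CVPixelBufferGetHeight(depth))
    }
}
```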

How Will LiDAR Transform the App Experience?

Measure 

The Measure app, introduced in iOS 12, becomes faster with the LiDAR sensor and makes it easier to measure lengths; with edge detection, objects can be measured with greater precision.
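
Measure’s own source isn’t public; the sketch below only shows the kind of primitive such an app can build on, assuming ARKit raycasting: two screen points are projected onto real-world geometry and the distance between the hits is the measured length.

```swift
import ARKit
import simd

// Hypothetical helper (not the Measure app's code): cast a ray from a screen point
// onto detected real-world geometry and return the hit position in world space.
func worldPoint(at screenPoint: CGPoint, in view: ARSCNView) -> simd_float3? {
    guard let query = view.raycastQuery(from: screenPoint,
                                        allowing: .estimatedPlane,
                                        alignment: .any),
          let hit = view.session.raycast(query).first else { return nil }
    let t = hit.worldTransform.columns.3
    return simd_float3(t.x, t.y, t.z)
}

// Length in metres between two tapped points, e.g. the two ends of a table edge.
func measuredLength(from a: simd_float3, to b: simd_float3) -> Float {
    simd_distance(a, b)
}
```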

Complete Anatomy

The depth data from the LiDAR sensor allowed the makers of Complete Anatomy to add a mobility assessment tool that helps physical therapists and nurses track their patients’ progress over time for faster recovery.

IKEA Place

IKEA LiDAR scanner in iPad Pro 2020

The IKEA Place app has a library of IKEA’s home furnishing range, and it is adding a Studio Mode that takes advantage of the LiDAR scanner to go from placing single pieces of furniture to intelligently furnishing your whole room in AR. Studio Mode will suggest room sets like those in the IKEA store, matched to your current furniture, style, and overall aesthetic.
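
IKEA hasn’t published how Studio Mode works; the sketch below only shows the ARKit feature that room-scale experiences like this can lean on: LiDAR-driven scene reconstruction, which hands the app a mesh of walls, floors, and furniture.

```swift
import ARKit

// IKEA's actual implementation is not public; this merely configures the
// LiDAR-only scene reconstruction that ARKit offers for room-scale AR.
func makeRoomScanningConfiguration() -> ARWorldTrackingConfiguration {
    let config = ARWorldTrackingConfiguration()
    config.planeDetection = [.horizontal, .vertical]
    // Reconstruct a triangle mesh of walls, floors and furniture (LiDAR devices only).
    if ARWorldTrackingConfiguration.supportsSceneReconstruction(.mesh) {
        config.sceneReconstruction = .mesh
    }
    return config
}
```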

Hot Lava

Hot Lava AR LiDAR scanner

The floor is hot lava. Do whatever you can; just don’t touch the floor. Hot Lava’s AR mode takes advantage of LiDAR technology to step up your hot-lava game, adding interaction that blurs the line between the digital world and reality.
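
Hot Lava’s source isn’t public either; the sketch below shows the RealityKit scene-understanding options a game like this could use so that virtual objects are hidden behind, and collide with, real-world geometry on LiDAR devices.

```swift
import RealityKit
import ARKit

// Hot Lava's implementation is not public; this sketches the RealityKit options
// that let virtual content be occluded by and collide with the real room.
func configureSceneUnderstanding(for arView: ARView) {
    let config = ARWorldTrackingConfiguration()
    if ARWorldTrackingConfiguration.supportsSceneReconstruction(.mesh) {
        config.sceneReconstruction = .mesh
    }
    // Real-world geometry occludes virtual objects and acts as a physics collider.
    arView.environment.sceneUnderstanding.options.insert([.occlusion, .physics])
    arView.session.run(config)
}
```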

Apple realized that LiDAR improves the AR experience more than camera technology alone ever could, and people who follow sensor technology agree with Apple.

The industry has tried camera-only AR experiences, using algorithms to estimate 3D spatial information, but ran into glitches and unsatisfying results. It was clear the approach couldn’t stand the test of time.

Conclusion

Apple’s LiDAR technology packs a lot of potential and, with enhancements in the years to come, is expected to change the way we interact with our world.

