To connect the digital and physical worlds, Scape Technologies is building an engine that allows camera-equipped devices to understand their environment and anchor augmented reality (AR) content to specific outdoor locations. Using the AR Cloud, the company plans to 3D-map 100 cities around the world.
Already live in London, Scape Technologies’ Vision Engine processes imagery from any source and uses the cloud to create a 3D representation of the environment that users can tap into with AR-capable devices. However, understanding a device’s precise location and orientation is one of the key challenges in tying digital information to specific physical spaces on a global scale.
Often, a GPS signal is used, and this provides a basic sense of location. However, GPS is typically accurate only to within about 7 meters, and accuracy degrades further in urban environments, where signals bounce off buildings. That margin of error makes a shared multi-user experience impossible: each user’s device would have a different frame of reference for where the AR content should appear. So how did Scape solve this issue? Using the AR Cloud.
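The frame-of-reference problem is easy to see with a little arithmetic. The following sketch (not Scape code; the uniform error model and the 7-meter figure from the paragraph above are simplifying assumptions) simulates two users whose devices each estimate position with independent GPS error, and measures how far apart the same geo-anchored content would appear to them:

```python
import random

GPS_ERROR_M = 7.0  # typical horizontal GPS accuracy; worse in cities

def perceived_anchor_offset(rng: random.Random) -> float:
    """Each user's device estimates its own position with independent
    GPS error, so the same geo-anchored content appears in a different
    place for each user. Returns the disagreement between two users'
    perceived anchor positions, in meters (1-D for simplicity)."""
    user_a_error = rng.uniform(-GPS_ERROR_M, GPS_ERROR_M)
    user_b_error = rng.uniform(-GPS_ERROR_M, GPS_ERROR_M)
    return abs(user_a_error - user_b_error)

rng = random.Random(42)
worst = max(perceived_anchor_offset(rng) for _ in range(10_000))
# worst-case disagreement approaches 2 * GPS_ERROR_M = 14 meters
```

With independent errors of up to 7 meters each, two users can disagree by nearly 14 meters on where an anchor sits, which is why a common visual reference frame, rather than per-device GPS, is needed for shared AR.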
Introduced in 2017, the AR Cloud serves as an index of the real world that can be used to align both content and devices to the same reference frame. The system uses computer vision to provide a precise ‘anchor’ for where devices are in the world, with a higher degree of accuracy than GPS. Current AR SDKs, such as ARKit and ARCore, also use computer vision to place AR content in the real world, but with a flaw: according to Apple’s ARKit documentation, persistent AR experiences can be unreliable because lighting conditions or features of the local environment change over time, affecting how a device perceives its surroundings. In a three-part series on building the AR Cloud, Scape CEO Edward Miller wrote that the company is developing a new method “to describe the 1’s and 0’s a camera interprets in a way which is faster, more reliable and more robust than any other method demonstrated to date”.
Earlier this year, Scape received $8M in seed funding and enabled its Visual Positioning Service (VPS) in London for AR applications via ScapeKit, the company’s SDK for the iOS, Android, and Unity development platforms. Over the last year, the company mobilized local teams on the ground, equipped with cameras, who captured over 2 billion images to build a 3D map of 100 cities, including Manhattan, San Francisco, Rio de Janeiro, Sydney, Paris, Moscow, and Tokyo.
“Our processing is all cloud-based, which means that the user isn’t constrained by device memory and processing times, so it can be experienced using a simple hand-held device like a mobile phone or tablet; it also means we’re optimized for city-scale projects and we have collected street view imagery of over 100 cities to help support this,” Scape told SPAR 3D.
Scape believes it is building 3D map infrastructure that will connect multiple industries, such as robotics, autonomous vehicles, large-scale AR, and drones. For example, during the company’s first hackathon, one team presented a concept for a drone delivery service in which users could define drone landing zones and no-fly zones using Scape’s technology.
“Because our service is cloud-based, it requires just a basic camera and data SIM card on the device – meaning it’s well suited to small products like drones, robots, and micro-mobility devices. Using computer vision technology will also be much more scalable and affordable than the multi-sensor systems which are used today.”
Additionally, Scape sees architecture, engineering, and construction (AEC) as one of the core industries that will benefit from adopting this technology, and the company recently demonstrated how the technology can be used in this regard.
“Over the years there has been a transition from 2D prints of designs, to 3D computer visualization; a VPS will provide the next step in 3D visualization, allowing site managers and clients to visualize and modify life-size 3D designs, outdoors in their intended geographical location,” Scape said. “Because ScapeKit returns device location and orientation in geographical coordinates, content can be anchored precisely to real-world locations and viewed persistently, bringing designs to life.”
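Anchoring content from geographical coordinates, as ScapeKit is described as doing, ultimately requires converting a latitude/longitude pair into meter offsets in a local frame around the device. The sketch below is not ScapeKit’s API; it is a generic local-tangent-plane (equirectangular) approximation, with hypothetical London-area coordinates, to illustrate the underlying conversion:

```python
import math

EARTH_RADIUS_M = 6_378_137.0  # WGS-84 equatorial radius

def geo_to_local_enu(anchor_lat: float, anchor_lon: float,
                     device_lat: float, device_lon: float) -> tuple[float, float]:
    """Approximate east/north offsets (meters) from the device to an
    anchor, using a local-tangent-plane approximation. Accurate enough
    over the few hundred meters relevant to outdoor AR; a real system
    would use a full geodetic conversion."""
    d_lat = math.radians(anchor_lat - device_lat)
    d_lon = math.radians(anchor_lon - device_lon)
    north = d_lat * EARTH_RADIUS_M
    east = d_lon * EARTH_RADIUS_M * math.cos(math.radians(device_lat))
    return east, north

# Hypothetical example: device and a design anchor ~13 m apart in London.
east, north = geo_to_local_enu(51.5081, -0.1280, 51.5080, -0.1281)
```

Once the device’s own geographic position and heading are known precisely (the role of the VPS), these east/north offsets place a life-size 3D design at a fixed real-world spot for every viewer.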
Right now, the company is working on the infrastructure for what it calls “evolving maps”.
“This will ensure that our map data does not become outdated as a result of changing environments, but instead updates independently using new environmental information,” the company explained. “Having up-to-date maps is key to providing accuracy and precision, and we believe it holds the key to the future of a Visual Positioning Service.”