The Augmented Cane helps people with impaired vision navigate by using sensors to understand the environment and feedback modes to help guide and inform the user.
The Augmented Cane uses a variety of sensors to help a person with impaired vision navigate different environments. The LIDAR measures distances to obstacles, the inertial measurement unit (IMU) provides orientation estimates, the GPS measures outdoor position, and the camera captures images.
Globally, more than 250 million people have impaired vision and face challenges navigating outside their homes, affecting their independence, mental health, and physical health. Navigating unfamiliar routes is challenging for people with impaired vision because it may require avoiding obstacles, recognizing objects, and wayfinding indoors and outdoors. Existing approaches such as white canes, guide dogs, and electronic travel aids only tackle some of these challenges. Here, we present the Augmented Cane, a white cane with a comprehensive set of sensors and an intuitive feedback method to steer the user, which addresses navigation challenges and improves mobility for people with impaired vision. We compared the Augmented Cane with a white cane by having sighted and visually impaired participants complete navigation challenges while blindfolded: walking along hallways, avoiding obstacles, and following outdoor waypoints. Across all experiments, the Augmented Cane increased the walking speed for participants with impaired vision by 18 ± 7% and sighted participants by 35 ± 12% compared with a white cane. The increase in walking speed may be due to accurate steering assistance, reduced cognitive load, fewer contacts with the environment, and higher participant confidence. We also demonstrate advanced navigation capabilities of the Augmented Cane: indoor wayfinding, recognizing and steering the participant to a key object, and navigating a sequence of indoor and outdoor challenges. The open-source and low-cost design of the Augmented Cane provides a platform that may improve the mobility and quality of life of people with impaired vision.
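To illustrate the waypoint-following task described above, here is a minimal sketch of how a device like the Augmented Cane might compute a steering signal from its GPS fix toward the next outdoor waypoint. This is not the authors' published implementation; it uses the standard initial great-circle bearing formula, and the function names and sign convention (negative error means steer left) are illustrative assumptions.

```python
import math

def initial_bearing(lat1, lon1, lat2, lon2):
    """Initial great-circle bearing in degrees (0-360) from the
    current GPS fix (lat1, lon1) to the waypoint (lat2, lon2)."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dlon = math.radians(lon2 - lon1)
    x = math.sin(dlon) * math.cos(phi2)
    y = (math.cos(phi1) * math.sin(phi2)
         - math.sin(phi1) * math.cos(phi2) * math.cos(dlon))
    return math.degrees(math.atan2(x, y)) % 360.0

def steering_error(current_heading, target_bearing):
    """Signed heading error in (-180, 180] degrees; a negative value
    means steer left, positive means steer right."""
    return (target_bearing - current_heading + 180.0) % 360.0 - 180.0
```

In such a system, the IMU would supply `current_heading`, and the resulting signed error would drive whatever feedback mode steers the user (for example, a haptic or motorized cue proportional to the error).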
At the beginning of the 20th century, a diagnosis of Type 1 diabetes was a death sentence. Starvation diets were employed to delay the life-threatening symptoms of diabetes, but patient death was inevitable.
Beginning on May 17, 1921, Frederick Banting and Charles Best, under the direction of J. J. R. Macleod, isolated what would later be known as insulin in a lab at the University of Toronto. Their extract was further purified and made safe for human injection by James Collip.
Thirteen-year-old Leonard Thompson was selected for their first human trial, the results of which would go on to save millions of lives around the world.
Fiona Amery is a PhD candidate in History and Philosophy of Science at the University of Cambridge.
It’s a question that has puzzled observers for centuries: do the fantastic green and crimson light displays of the aurora borealis produce any discernible sound?
Conjured by the interaction of solar particles with gas molecules in Earth’s atmosphere, the aurora generally occurs near Earth’s poles, where the magnetic field is strongest. Reports of the aurora making a noise, however, are rare – and were historically dismissed by scientists.
But a Finnish study in 2016 claimed to have finally confirmed that the northern lights really do produce sound audible to the human ear. One of the researchers involved in the study even made a recording that purportedly captured the sound produced by the captivating lights 70 metres above ground level.
The joint European Space Agency (ESA) and Japan Aerospace Exploration Agency (JAXA) BepiColombo spacecraft captured this view of Mercury on Oct. 1, 2021, during the first of six flybys on its voyage to orbit the planet in 2025.
Two spacecraft built by Europe and Japan captured their first up-close look at the planet Mercury in a weekend flyby, revealing a rocky world covered with craters.
The two linked probes, known together as BepiColombo, snapped their first image of Mercury late Friday (Oct. 1) during a flyby that sent them zooming around the planet. The encounter marked the first of six Mercury flybys for BepiColombo, a joint effort by the space agencies of Europe and Japan, to slow itself enough to enter orbit around the planet in 2025.
How do zebras get their stripes? How do leopards get their spots? And how do giraffes get their giraffe-shaped thingies, whatever they are called? Would you believe the answer is… math? This is the story of a WWII wartime codebreaker and his quest to decode nature’s most beautiful patterns. Alan Turing uncovered a simple code that explains everything from stripes to spots and all the patterns in between… he was just too far ahead of his time. Only recently have biologists found evidence that his pattern-forming system may actually be at work in nature.
“We are laser-focused on the analysis of stool,” says the Duke University research professor, with all the unselfconsciousness of someone used to talking about bodily functions. “We think there is an incredible untapped opportunity for health data. And this information is not tapped because of the universal aversion to having anything to do with your stool.”