Would you consent to a surveillance system that watches without video and listens without sound?

If your knee-jerk reaction is “no!”, followed by “huh?”, I’m with you. In a new paper in Applied Physics Letters, a Chinese team is wading into the complicated balance between privacy and safety with computers that can echolocate. By training AI to sift through signals from arrays of acoustic sensors, the system can gradually learn to parse your movements—standing, sitting, falling—using only ultrasonic sound.

According to study author Dr. Xinhua Guo at the Wuhan University of Technology, the system may be more palatable to privacy advocates than security cameras. Because it relies on ultrasonic waves—the type that bats use to navigate dark spaces—it doesn’t capture video or audio. It’ll track your body position, but not you per se.

AI echolocation: An array of acoustic emitters and receivers (green chip) gathers ultrasonic waves to teach an AI to detect human movement. Image Credit: Xinhua Guo.

When further miniaturized, the system could allow caretakers to monitor elderly people who live alone for falls inside the house, or track patient safety inside hospital rooms. It could even be installed in public areas—trains, Ubers, libraries, park bathrooms—to guard against violence or sexual harassment, or replace video cameras in Airbnb homes to protect both the property and the guests’ privacy.

Because the system only detects body movement, there’s no issue with facial recognition—or perhaps any identification—based on recordings alone. The system doesn’t even generate the blob-like body shapes that grace the body scanner screens at US airports. It’s monitoring, yes, but with a thin veil of privacy, similar to leaving semi-anonymous comments online.

At least that’s the pitch. If you’re a tad skeptical, so am I. San Francisco recently banned facial recognition technology, and New York may soon follow with even more stringent surveillance rules. But in countries where security cameras number in the hundreds of thousands and privacy isn’t necessarily a basic right, an echolocating monitoring system may better appease people who are increasingly uncomfortable having their every move watched and recorded.

“Protecting privacy from the intrusion of surveillance cameras has become a global concern. We hope that this technology can help reduce the use of cameras in the future,” said Guo.

How Does It Work?

The team took a cue from bats and other animals that use echolocation as their main navigational tool.

To echolocate, you need two main types of hardware: an emitter that blasts out ultrasonic waves to bounce off surfaces, and a receiver, such as a microphone, that collects the reflected waves. According to Guo, previous attempts at machine echolocation often used only a single emitter and a handful of receivers. Effective, yes, but not efficient, like a somewhat handicapped bat.

“The recognition accuracy was not very high” at roughly 90 percent, the authors said. For a system to work as well as video cameras, the accuracy needs to be near perfect.

The team set up four emitters in a three-dimensional space, each shooting out sound waves at 40 kHz—roughly twice the upper limit of hearing for a healthy young person. To capture the rebounding waves, they used a 256-element acoustic array, lined up on a single surface in a 16-by-16 grid. Both transmitters and receivers sit on a single chip-like structure, visually similar to round seeds dotting a green lotus pod.
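For readers who want to picture the data, here is a minimal sketch, assuming Python and made-up sampling parameters, of how one capture from such a 16-by-16 receiver grid might be laid out; it illustrates the setup described above and is not the authors’ code.

import numpy as np

# Hypothetical parameters: the study reports a 16-by-16 receiver grid and
# 40 kHz emitters; the sampling rate and burst length here are assumptions.
ROWS, COLS = 16, 16            # 256-element receiver grid
FS = 192_000                   # assumed sampling rate in Hz (well above 2 x 40 kHz)
BURST_SAMPLES = 2048           # assumed number of samples per capture

def read_row(row_index: int) -> np.ndarray:
    """Placeholder for reading out one 16-receiver row of the array."""
    return np.random.randn(COLS, BURST_SAMPLES).astype(np.float32)

# One capture is a 3-D array indexed by (row, column, time sample),
# filled one row at a time, as described below.
frame = np.zeros((ROWS, COLS, BURST_SAMPLES), dtype=np.float32)
for r in range(ROWS):
    frame[r] = read_row(r)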

Each time a volunteer moves in front of the array—standing, sitting, falling, or walking—the receiver scans through the blanket of reflected sound waves, one row at a time. In all, the team had four people of different heights and weights come in, allowing the system to better generalize a particular data pattern to a movement rather than to a certain person.

Now comes the brainy part. To mimic bat-brain processing in computers, the team turned to a convolutional neural network (CNN), the golden child and workhorse of many current computer vision systems. They designed an algorithm that first pre-processes all the echolocation data to strip out noise—anything the sensors picked up outside the target 40 kHz, give or take 5 kHz for leeway.
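The paper’s preprocessing code isn’t reproduced here, but the noise-stripping step it describes, keeping only energy within roughly 5 kHz of the 40 kHz carrier, can be sketched with an ordinary band-pass filter. The sampling rate and filter order below are assumptions.

import numpy as np
from scipy.signal import butter, sosfiltfilt

FS = 192_000                 # assumed sampling rate in Hz
LOW, HIGH = 35_000, 45_000   # keep roughly 40 kHz plus or minus 5 kHz

# Fourth-order Butterworth band-pass, applied forward and backward
# (zero phase) so echo timing is not shifted.
sos = butter(4, [LOW, HIGH], btype="bandpass", fs=FS, output="sos")

def denoise(signal: np.ndarray) -> np.ndarray:
    """Suppress everything the receivers picked up outside the 35-45 kHz band."""
    return sosfiltfilt(sos, signal, axis=-1)

raw = np.random.randn(16, 16, 2048)   # one frame of raw receiver data (made up)
clean = denoise(raw)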

The algorithm then parsed the gathered data over time to fish out movement patterns, similar to how brain-machine interfaces find muscle intent in neural electrical signals. For example, sitting reflects a slightly different pattern of sound waves than standing or falling. As with other deep neural networks, it’s hard to explain exactly how each body position differs in terms of its echo signature, but the acoustic fingerprints are distinct enough for the algorithm to correctly identify the four tested activities 97.5 percent of the time.
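The exact network isn’t specified in this write-up, so the following is only an illustrative stand-in: a small CNN, written in PyTorch (the authors’ framework is not stated), that maps a single-channel 16-by-16 acoustic frame to one of the four activity classes.

import torch
import torch.nn as nn

class EchoCNN(nn.Module):
    """Toy CNN mapping a 1x16x16 acoustic frame to four activity classes
    (standing, sitting, falling, walking). The architecture is an assumption,
    not the one reported in the paper."""
    def __init__(self, num_classes: int = 4):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),                              # 16x16 -> 8x8
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),                              # 8x8 -> 4x4
        )
        self.classifier = nn.Linear(32 * 4 * 4, num_classes)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.classifier(self.features(x).flatten(1))

model = EchoCNN()
frame = torch.randn(1, 1, 16, 16)             # a batch of one acoustic frame
probabilities = model(frame).softmax(dim=1)   # four class probabilities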

In general, the algorithm seems to identify static postures such as sitting and standing better than it does movements. This is expected, the authors explained, because falling and walking introduce individual differences in how people move, making it harder for computers to extract a general acoustic pattern.

Big Brother?

Guo’s study further expands a relatively new field called human activity recognition, in which computers try to infer what a person is doing from sensor data alone. It might sound incredibly “big brother,” but anyone who has a Fitbit, Apple Watch, or another activity tracker has already reaped the benefits—human activity recognition is how your smartwatch counts your steps using its embedded motion sensors. The field also encompasses video surveillance, such as computers figuring out what a person is doing based on pixels in images or videos. Have a Microsoft Kinect? That nifty box uses infrared light, a video camera, and depth sensors to identify your movement while gaming.
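To make the step-counting example above concrete: a hedged sketch of the basic idea, and not any vendor’s actual algorithm, is peak detection on the magnitude of the motion-sensor signal. The sampling rate and thresholds are assumptions.

import numpy as np
from scipy.signal import find_peaks

def count_steps(accel_xyz: np.ndarray, fs: float = 50.0) -> int:
    """Rough step count from an (N, 3) array of accelerometer samples.
    The thresholds are illustrative guesses, not tuned values."""
    magnitude = np.linalg.norm(accel_xyz, axis=1)   # combine the three axes
    magnitude = magnitude - magnitude.mean()        # remove the gravity offset
    # Each step shows up as a peak; require peaks at least 0.3 s apart.
    peaks, _ = find_peaks(magnitude, height=1.0, distance=int(0.3 * fs))
    return len(peaks)

# Example with fabricated data: two seconds of a 2 Hz bounce sampled at 50 Hz.
t = np.arange(0, 2, 1 / 50.0)
fake = np.stack([np.zeros_like(t), np.zeros_like(t),
                 9.8 + 3 * np.sin(2 * np.pi * 2 * t)], axis=1)
print(count_steps(fake))   # -> 4 with this fabricated signal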

“Human activity recognition is widely used in many fields, such as the monitoring of smart homes, fire detecting and rescuing, hospital patient management, etc,” the authors explained.

As sensors become increasingly lightweight, the technology will only expand. In 2017, a Chinese-American collaboration found that it’s possible to track human movements based on surrounding WiFi signals alone. Many of these systems are still too large to be completely portable, but further hardware miniaturization seems all but inevitable.

Not everyone protests against increased monitoring. Caregivers in particular might appreciate the technology to alert them to the elderly falling—something innocuous when young but deadly after a certain age. The authors envision a fully automated system in which a fall automatically alerts multiple sources of help, without necessarily leaking what the person is doing before the fall.

But good intentions aside, Guo’s system screams of misuse potential. In stark contrast to facial recognition, little discussion has so far focused on privacy issues surrounding human activity tracking. According to Jake Metcalf, a technological ethicist at the think tank Data & Society in New York, the system could easily be repurposed to listen in on peoples’ private lives, or combined with existing technologies to further increase surveillance coverage.

For now, Guo’s team is reluctant to weigh in on the privacy ramifications. Rather, they hope to further tailor the system to more complex activities and “random” situations in which a person may simply be lounging about.

“As we know, human activities are complicated, taking falling as an example, and can present in various postures. We are hoping to collect more datasets of falling activity to reach higher accuracy,” he said.

Source: SingularityHub

Artificial Intelligence is changing banking, health, business, and the military. But so far, it has been slow to go big in K-12 education, said Scott Garrigan, a professor at Lehigh University, at a session at the International Society for Technology in Education's annual conference.

But that is likely to change in the coming years, he said. No sector will be untouched by AI.

"AI will change society. It will produce changes as big as the automobile," Garrigan said. "We have no idea what's going to happen as AI rolls out massively. But there will be massive, massive change. AI is the new electricity. I can't think of any industry AI will not transform." Ultimately, that will include K-12 schools too, he said.

Here were some of the big questions for educators to tackle:

How Will AI Change Curriculum?

Calculus and arithmetic won't be as important, Garrigan predicts. Instead, schools will likely begin emphasizing statistics and probability. And there'll be less of an emphasis on performing hard calculations, because that's something machines can already do.

"Who doesn't have access to a calculator?" he asked. "Spending twelve years to help kids be the equivalent of a two -step calculating algorithm, that's absurd."

He's betting schools will shift away from programming in Java and move toward other computing languages, like Python, that have greater application with AI.

What's more, students will need to grasp AI itself. Not just its technical implications, but the societal ones too.

"Teachers and students need to understand this stuff," Garrigan said. That's because AI will bring about "not just technological but social change," including creating jobs that don't currently exist.

Some schools are already beginning to make these shifts. More from Ed Week here.

Will AI Change Teaching and District Management?

In short: Yes, Garrigan said. In fact, that's already happening, to some extent, he suggested.

Innova Schools in Lima, Peru, is using IBM's Watson to scan resumes for teacher hiring, he said. "They discovered that credentials on a resume can't tell them how well a teacher will do in their environment," he said. But they've trained Watson to spot those qualities. Schools can also use AI to flag students who may be at risk of mental health issues such as depression, or of suicide. And AI already powers some personalized learning software and adaptive testing.

As for teachers? Down the line they may have "AI partners." These partners could do the "dirty work" (like some grading) while the teacher does the "fun work" (like connecting with and encouraging students). (The flip side of that scenario, many teachers worry, is that they will be replaced by machines.)  

How Will Educators Cope With AI's Flaws?

One big question for the future: will AI be a "decider," as in "the AI system says this is what I should do," or will it be an "adviser"?

It's critical for schools to keep in mind that AI isn't always going to spit out perfect solutions. It will make mistakes, just like the human brain it's modeled on, Garrigan said. "AI will look for probabilities, not answers," he said. "AI comes out with probability distributions. AI systems have error, it's built in. You can't escape it. Forget perfection."
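Garrigan's "probability distributions, not answers" point, and the decider-versus-adviser question above, are easy to picture in a few lines of Python. A model returns a distribution over outcomes; whether to act on the top label automatically or hand uncertain cases to a human is a policy choice layered on top. The numbers below are invented for illustration.

# Hypothetical output of a model scoring one student, say for dropout risk.
probabilities = {"low_risk": 0.62, "medium_risk": 0.28, "high_risk": 0.10}

# "Decider": act on the most likely label automatically, errors and all.
decision = max(probabilities, key=probabilities.get)      # -> "low_risk"

# "Adviser": only surface the case to a human when the model is unsure.
needs_human_review = max(probabilities.values()) < 0.80   # -> True here

print(decision, needs_human_review)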

What's more, because AI systems must take in data to become more accurate, educators should "expect massive issues with privacy." AI also has some serious bias problems that can have a major impact.

Source: Education Week

Teachers don't always know how well their methods work. They can ask questions and hand out tests, of course, but it's not always clear who's at fault if the message doesn't get through. AI might do the trick before long, though. Dartmouth College researchers have produced a machine learning algorithm that measures activity across your brain to determine how well you understand a given concept.

The team started out by having rookie and intermediate engineering students take standard tests and answer questions about pictures while sitting in an fMRI scanner. From there, they had the algorithm generate "neural scores" that could predict a student's performance. The more certain parts of the brain lit up, the easier it was to tell whether or not a student grasped the concepts at play.
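The study's exact pipeline isn't detailed here, so the following is only a hedged illustration of the general recipe such work tends to follow: fit a model on fMRI-derived activity features and treat its cross-validated prediction of test performance as the "neural score." All shapes and values below are made up.

import numpy as np
from sklearn.linear_model import Ridge
from sklearn.model_selection import cross_val_predict

# Made-up data: one row of brain-activity features per student (for example,
# mean fMRI activation in a set of regions) and that student's test score.
rng = np.random.default_rng(0)
brain_features = rng.normal(size=(30, 50))   # 30 students, 50 regions (invented)
test_scores = rng.normal(size=30)

# "Neural score": the model's cross-validated prediction of test performance
# from brain activity alone; compare it with the real scores.
neural_scores = cross_val_predict(Ridge(alpha=1.0), brain_features, test_scores, cv=5)
correlation = np.corrcoef(neural_scores, test_scores)[0, 1]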

You're not about to get brain scans in between classes, and there are limitations to the existing research. For one, Dartmouth focused on STEM learning -- it's not clear if your brain would react the same way in a literature class. The neural scores also apply only to narrow demonstrations of knowledge. This could, however, help teachers refine their classes by identifying techniques that resonate with most students before exam results come in. Don't be surprised if school is eventually much more engaging.

Source: Engadget

Infosys Chairman Nandan Nilekani

Infosys is betting big on automation and Artificial Intelligence, which, it expects, will transform the businesses of its clients, Chairman Nandan Nilekani said at the tech company’s 38th Annual General Meeting on Saturday. However, the use of automation and Artificial Intelligence is not just for the company’s clients but also for its employees. “We are relying on extreme automation to free up our people to focus more than ever on solving client challenges, mentoring their teams and investing in continuous learning,” Nandan Nilekani said. Speaking about the company’s efforts in strengthening digital capabilities, the company chairman said that Infosys has especially worked in the areas of experience, data, analytics, cloud, SaaS, IoT, cybersecurity, AI, and machine learning.

Infosys, founded in 1981 by N R Narayana Murthy along with Nandan Nilekani and others, saw revenue grow 9% in constant currency terms in the last financial year, and its total revenue now stands at $11.8 billion. Infosys’ digital revenue also grew 33.8% and now accounts for a third of the company’s total revenue. Addressing shareholders, directors and employees, the Infosys chairman said that the company generated a 36% total shareholder return for fiscal 2019. “The Board of Directors has recommended a final dividend of Rs 10.5 per share for fiscal 2019. Coupled with an interim dividend of Rs 7 per share paid in October 2018 and a special dividend of Rs 4 per share paid in January 2019, the total dividend paid last year was Rs 21.50 per share,” Nandan Nilekani said.

Infosys has also made global acquisitions of late, including Brilliant Basics, WongDoody, and Fluido, and Nandan Nilekani said these acquisitions are seeing strong traction with clients. Infosys has also partnered with Temasek in Singapore and South-East Asia, and with Hitachi, Panasonic and Pasona in Japan.

The second-largest tech company in India, Infosys attracts a large number of software engineers every year. The company spends about Rs 14 lakh on training each new recruit, after which it takes them 12 weeks to become productive, Ravi Kumar, President and Deputy COO, said in a recent interview with CNBC TV18.

Source: Financial Express
