In a bid to give machines the ability to predict intent when interacting with humans, a team at the University of New South Wales (UNSW) Sydney is developing an artificial intelligence-driven prototype human-machine interface system that will allow machines to be seen not merely as tools, but as partners.
Dr Lina Yao, a senior lecturer of engineering at UNSW and principal investigator, is busy getting AI systems and human-machine interfaces up to speed with the finer nuances of human behaviour.
The ultimate goal is for her research to be used in autonomous AI systems, robots and even cyborgs, but the first step is focused on the interface between humans and intelligent machines.
"What we're doing in these early phases is to help machines learn to act like humans based on our daily interactions and the actions that are influenced by our own judgment and expectations - so that they can be better placed to predict our intentions," Yao said in a university statement.
At the moment, AI may do a plausible job of detecting the intent of another person, but only after the fact.
It may even have a list of predefined, possible responses that a human might give in a given situation. But when an AI system or machine has only a few clues or partial observations to go on, its responses can sometimes be a little robotic.
Dr Yao is working on less obvious examples of human behaviour integrated into AI systems to improve intent prediction.
Things like gestures, eye movement, posture, facial expression and even micro-expressions - the tell-tale physical signs when someone reacts emotionally to a stimulus but tries to keep it hidden.
"We can learn and predict what a human would like to do when they're wearing an EEG [electroencephalogram] device," said Dr Yao.
While wearing one of these devices, whenever the person makes a movement, their brainwaves are collected for researchers to analyse.
"Later we can ask people to think about moving with a particular action - such as raising their right arm. So not actually raising the arm, but thinking about it, and we can then collect the associated brain waves."
Recording this data has the potential to help people unable to move or communicate freely due to disability or illness.
Brain waves recorded with an EEG device could be analysed and used to move machinery such as a wheelchair, or even to communicate a request for assistance.
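The pipeline described above - record brainwaves during an imagined movement, analyse them, and translate the result into a device command - can be sketched in miniature. The code below is purely illustrative and is not UNSW's system: the command names, the synthetic "EEG" feature vectors, and the nearest-centroid decoder are all assumptions standing in for real band-power features and far richer models.

```python
# Toy sketch (not the UNSW system): decode imagined-movement "EEG"
# feature vectors into assistive-device commands.
# Real pipelines extract band-power features (e.g. mu/beta rhythms)
# from multi-channel recordings; here we use synthetic clusters and
# a nearest-centroid rule for illustration only.
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical command set for a wheelchair-style device.
COMMANDS = ["move_left", "move_right", "request_help"]

# Synthetic training data: three classes of 8-dim feature vectors,
# each clustered tightly around its own random centroid.
centroids = rng.normal(0.0, 1.0, size=(3, 8))
X = np.vstack([c + rng.normal(0.0, 0.1, size=(50, 8)) for c in centroids])
y = np.repeat(np.arange(3), 50)

# "Training" here is just storing the per-class mean feature vector.
class_means = np.array([X[y == k].mean(axis=0) for k in range(3)])

def decode(features: np.ndarray) -> str:
    """Map one EEG feature vector to the command of the nearest class."""
    dists = np.linalg.norm(class_means - features, axis=1)
    return COMMANDS[int(np.argmin(dists))]

# A new sample near centroid 1 (say, an imagined right-arm raise)
# decodes to the corresponding command.
sample = centroids[1] + rng.normal(0.0, 0.1, size=8)
print(decode(sample))
```

In practice the decoder would be trained on real labelled recordings and would have to cope with noise, drift between sessions, and differences between people, which is where the research effort lies.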
According to Yao, autonomous AI systems and machines may one day look at us as belonging to one of three categories after observing our behaviour - peer, bystander or competitor.
"While this may seem cold and aloof, these categories may dynamically change from one to another according to their evolving contexts."
At any rate, she said, this sort of cognitive categorisation is actually very human.
- IANS