Google’s New Baby Monitor Uses AI To Alert Parents Before Baby Wakes Up

Source: industryreport24.com

Parents can now feel more assured about their baby’s safety with the launch of Google’s latest baby monitor. Google has come up with an even more advanced baby monitoring device, which uses artificial intelligence to alert parents minutes before their baby wakes from sleep.

The new technology uses AI to help ensure a baby’s safety while easing the burden on parents.

According to Google’s patent application filed in the U.S., the device is designed to determine when the child is in a Non-Auditory Discomfort State (NADS) and to inform the caretaker at least 10 minutes before the infant wakes up.

According to a recently published Google document, the device relies on eye-tracking technology to determine whether the baby is asleep or awake. Using AI, audio recordings, and video streaming, the baby monitor can screen the baby’s behavior against databases of standard patterns.

The device would analyze the baby’s movements and noises, allowing it to perceive any change and alert the parents if the infant appears uncomfortable. The baby’s body posture would also be taken into account: whether the baby is sleeping, standing, kneeling, or in some other position. According to the patent, the machine would generate an alert even before the child wakes up or cries.
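The patent does not disclose the actual algorithm, but the idea of comparing recent movement and noise readings against a baseline of normal sleep patterns can be sketched in a few lines. The following is a purely illustrative toy example, not Google’s implementation; the function name, threshold, and activity values are assumptions for demonstration.

```python
# Illustrative sketch only (not Google's method): flag a possible
# pre-wake / discomfort state when a recent window of activity
# readings deviates sharply from a baseline of quiet-sleep readings.

from statistics import mean, stdev

def is_pre_wake_state(recent_readings, baseline_readings, threshold=3.0):
    """Return True when the recent average activity level deviates
    from the sleeping baseline by more than `threshold` standard
    deviations (a simple z-score anomaly check)."""
    mu = mean(baseline_readings)
    sigma = stdev(baseline_readings)
    if sigma == 0:
        return mean(recent_readings) != mu
    z = abs(mean(recent_readings) - mu) / sigma
    return z > threshold

# Baseline: activity levels sampled during quiet sleep (hypothetical).
baseline = [0.9, 1.1, 1.0, 0.8, 1.2, 1.0, 0.9, 1.1]

# Recent window: a spike in movement/noise that may precede waking.
recent = [3.5, 4.0, 3.8]

if is_pre_wake_state(recent, baseline):
    print("alert: baby may wake soon")  # notify the caretaker
```

A real system would of course fuse many signals (video, audio, posture classification) rather than a single activity number, but the comparison-against-baseline principle is the same.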

The patent notes that most of the time parents are alerted by the infant’s cry, but a baby in distress does not always cry. For example, an infant that is awake and moving around instead of asleep, or worse, has become tangled or is choking, is in distress but may not make any noise. This is where Google’s latest baby monitor comes in, alerting parents to their baby’s safety.
