Not quite the universal translator that we've seen in Star Trek for years. They just kind of waved a hand at it: how can we have a normal conversation when we go to all these planets? Everyone speaks English everywhere we go in the whole universe. So they cooked up this idea of a universal translator. Well, we're on our way. That video is four or five years old; I don't know how far they've come since, but it's pretty cool.

The other class I talked about the cocktail party problem, and I posted this PDF, if anyone had a chance to look at it. I just wanted to flash up a couple of pages of it here. I'm not going to go through all of it, but they used machine learning algorithms to improve the quality. I put a red box around this part, and it worked: "In a study involving 24 test subjects, we demonstrated that this program could boost the comprehension of hearing-impaired people by about 50 percent," which is really, really huge. It goes through a description of the training that they did: they trained the network offline, and then embedded the trained algorithm into the hearing aid in its deployed state.

So their network had an input layer and 123 hidden layers, and then an output layer, with all this interconnection, much like the synapses in our brains: neurons with axons going into other neurons. Very similar. I thought this was cool.

Even people with normal hearing were able to better understand noisy sentences, "which means our program could someday help far more people than originally anticipated. Listeners with normal hearing understood 37 percent of the words spoken amid a steady stream of noise without the program, and 80 percent with it. For the babble, their performance improved from 42 percent to 78 percent." So it improved everyone's hearing, which I thought was pretty cool. It's a cool article.
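Since we're talking about that layered network, here is a minimal sketch in Python of the general idea: an input layer, some hidden layers, and an output layer, applied frame by frame to noisy audio. The layer sizes, the ReLU/sigmoid activations, and the mask-style output are all my assumptions for illustration; they are not the actual architecture or training details from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def relu(x):
    # Hidden-layer activation: pass positives, zero out negatives
    return np.maximum(0.0, x)

def sigmoid(x):
    # Output activation: squashes each value into (0, 1)
    return 1.0 / (1.0 + np.exp(-x))

# Hypothetical sizes: a 64-bin noisy spectrogram frame in,
# a per-bin 0..1 "keep or suppress" mask out.
sizes = [64, 128, 128, 64]
weights = [rng.standard_normal((m, n)) * 0.1
           for m, n in zip(sizes[:-1], sizes[1:])]
biases = [np.zeros(n) for n in sizes[1:]]

def forward(frame):
    """One forward pass: ReLU hidden layers, sigmoid mask output."""
    h = frame
    for W, b in zip(weights[:-1], biases[:-1]):
        h = relu(h @ W + b)
    return sigmoid(h @ weights[-1] + biases[-1])

# In deployment the trained weights are fixed (trained offline, then
# embedded in the device), and each incoming frame is just one forward pass.
noisy_frame = rng.standard_normal(64)
mask = forward(noisy_frame)
denoised = noisy_frame * mask  # suppress the bins the mask marks as noise
```

The training would happen offline, as the article describes; once the weights are learned, the deployed hearing aid only needs this cheap forward pass per audio frame.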