Pattie Maes and Pranav Mistry demo SixthSense

I've been intrigued by this question of whether we could evolve or develop a sixth sense -- a sense that would give us seamless access and easy access to meta-information or information that may exist somewhere that may be relevant to help us make the right decision about whatever it is that we're coming across. And some of you may argue, well, don't today's cell phones do that already? But I would say no. When you meet someone here at TED -- and this is the top networking place, of course, of the year -- you don't shake somebody's hand and then say, "Can you hold on for a moment while I take out my phone and Google you?" Or when you go to the supermarket and you're standing there in that huge aisle of different types of toilet papers, you don't take out your cell phone, and open a browser, and go to a website to try to decide which of these different toilet papers is the most ecologically responsible purchase to make. So we don't really have easy access to all this relevant information that can just help us make optimal decisions about what to do next and what actions to take. And so my research group at the Media Lab has been developing a series of inventions to give us access to this information in a sort of easy way, without requiring that the user changes any of their behavior. And I'm here to unveil our latest effort, and most successful effort so far, which is still very much a work in process. I'm actually wearing the device right now and we've sort of cobbled it together with components that are off the shelf -- and that, by the way, only cost 350 dollars at this point in time. I'm wearing a camera, just a simple webcam, a portable, battery-powered projection system with a little mirror. These components communicate to my cell phone in my pocket which acts as the communication and computation device. And in the video here we see my student Pranav Mistry, who's really the genius who's been implementing and designing this whole system. And we see how this system lets him walk up to any surface and start using his hands to interact with the information that is projected in front of him. The system tracks the four significant fingers. In this case, he's wearing simple marker caps that you may recognize. But if you want a more stylish version you could also paint your nails in different colors.
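The talk doesn't go into the vision pipeline, but a minimal sketch of how four colored marker caps could be tracked from the body-worn webcam might look like the following. The HSV color ranges, the camera index, and the OpenCV-based approach are my assumptions for illustration, not details from the SixthSense implementation.

```python
# Hypothetical fingertip-marker tracker (assumes OpenCV 4.x): segment each of the
# four marker-cap colors in HSV space and report the centroid of the largest blob.
import cv2
import numpy as np

# Illustrative HSV ranges for the four marker colors; real values need tuning.
MARKER_RANGES = {
    "red":    ((0, 120, 120), (10, 255, 255)),
    "green":  ((45, 120, 120), (75, 255, 255)),
    "blue":   ((100, 120, 120), (130, 255, 255)),
    "yellow": ((20, 120, 120), (35, 255, 255)),
}

def find_markers(frame_bgr):
    """Return {color: (x, y)} image coordinates for each marker cap found."""
    hsv = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2HSV)
    positions = {}
    for color, (lo, hi) in MARKER_RANGES.items():
        mask = cv2.inRange(hsv, np.array(lo), np.array(hi))
        contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
        if not contours:
            continue
        largest = max(contours, key=cv2.contourArea)
        if cv2.contourArea(largest) < 50:      # ignore tiny noise blobs
            continue
        m = cv2.moments(largest)
        positions[color] = (int(m["m10"] / m["m00"]), int(m["m01"] / m["m00"]))
    return positions

if __name__ == "__main__":
    cam = cv2.VideoCapture(0)            # stand-in for the body-worn webcam
    ok, frame = cam.read()
    if ok:
        print(find_markers(frame))       # e.g. {'red': (312, 208), 'green': (401, 233)}
    cam.release()
```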

And the camera basically tracks these four fingers and recognizes any gestures that he's making so he can just go to, for example, a map of Long Beach, zoom in and out, etc. The system also recognizes iconic gestures such as the "take a picture" gesture, and then takes a picture of whatever is in front of you. And when he then walks back to the Media Lab, he can just go up to any wall and project all the pictures that he's taken, sort through them and organize them, and re-size them, etc., again using all natural gestures. So, some of you most likely were here two years ago and saw the demo by Jeff Han, or some of you may think, "Well, doesn't this look like the Microsoft Surface Table?" And yes, you also interact using natural gestures, both hands, etc. But the difference here is that you can use any surface, you can walk up to any surface, including your hand if nothing else is available, and interact with this projected data. The device is completely portable, and can be ... (Applause)
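How the tracked fingertips turn into gestures isn't spelled out in the talk; one plausible reading, sketched below under my own assumptions, is that a pinch-style zoom can be derived from how the distance between two fingertip markers changes between frames (using the `find_markers` output from the sketch above).

```python
# Hedged sketch: infer zoom in/out from the change in distance between two
# tracked fingertip markers across frames. Thresholds and marker names are
# illustrative assumptions, not values from the SixthSense system.
import math

def pinch_distance(markers, a="red", b="green"):
    """Pixel distance between two tracked fingertip markers."""
    (x1, y1), (x2, y2) = markers[a], markers[b]
    return math.hypot(x1 - x2, y1 - y2)

def zoom_factor(prev_markers, curr_markers, min_change=5.0):
    """Return >1.0 to zoom in, <1.0 to zoom out, 1.0 if the pinch barely moved."""
    prev_d = pinch_distance(prev_markers)
    curr_d = pinch_distance(curr_markers)
    if prev_d == 0 or abs(curr_d - prev_d) < min_change:
        return 1.0
    return curr_d / prev_d

# Example: the fingers spread apart between two frames, so zoom in by 1.5x.
prev = {"red": (100, 100), "green": (140, 100)}
curr = {"red": (90, 100), "green": (150, 100)}
print(zoom_factor(prev, curr))   # 1.5
```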

So one important difference is that it's totally mobile. Another even more important difference is that in mass production this would not cost more tomorrow than today's cell phones and would actually not sort of be a bigger packaging -- could look a lot more stylish than this version that I'm wearing around my neck. But other than letting some of you live out your fantasy of looking as cool as Tom Cruise in "Minority Report," the reason why we're really excited about this device is that it really can act as one of these sixth-sense devices that gives you relevant information about whatever is in front of you. So we see Pranav here going into the supermarket and he's shopping for some paper towels. And, as he picks up a product the system can recognize the product that he's picking up, using either image recognition or marker technology, and give him the green light or an orange light. He can ask for additional information. So this particular choice here is a particularly good choice, given his personal criteria. Some of you may want the toilet paper with the most bleach in it rather than the most ecologically-responsible choice.

(Laughter)
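The supermarket step above (recognize the product, then shine a green or orange light) boils down to a lookup against the wearer's personal criteria. The product database, the "eco score" metric, and the threshold below are invented for illustration; the talk only says the decision uses image recognition or marker technology plus personal criteria.

```python
# Hypothetical decision step: once a product has been identified (not shown),
# compare it against the wearer's criteria and pick the projected light color.
from dataclasses import dataclass

@dataclass
class Product:
    name: str
    eco_score: float   # made-up metric: 0.0 (worst) .. 1.0 (most eco-responsible)

PRODUCT_DB = {
    "12345": Product("Recycled paper towels", eco_score=0.9),
    "67890": Product("Extra-bleached paper towels", eco_score=0.2),
}

def light_for(product_id, prefers_eco=True, threshold=0.5):
    """Return 'green' if the product matches the wearer's criteria, else 'orange'."""
    product = PRODUCT_DB.get(product_id)
    if product is None:
        return "orange"                 # unknown product: flag it for a closer look
    matches = (product.eco_score >= threshold) if prefers_eco else (product.eco_score < threshold)
    return "green" if matches else "orange"

print(light_for("12345"))                     # green, under the default eco-friendly criteria
print(light_for("67890", prefers_eco=False))  # green, for the most-bleach shopper in the joke
```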

If he picks up a book in the bookstore, he can get an Amazon rating -- it gets projected right on the cover of the book. This is Juan's book, our previous speaker, which gets a great rating, by the way, at Amazon. And so, Pranav turns the page of the book and can then see additional information about the book -- reader comments, maybe sort of information by his favorite critic, etc. If he turns to a particular page, he finds an annotation by maybe an expert or a friend of ours that gives him a little bit of additional information about whatever is on that particular page. Reading the newspaper -- it never has to be outdated.

(Laughter)

You can get video annotations of the event that you're reading about. You can get the latest sports scores, etc. This is a more controversial one.

(Laughter)

As you interact with someone at TED, maybe you can see a word cloud of the tags, the words that are associated with that person in their blog and personal web pages. In this case, the student is interested in cameras, etc. On your way to the airport, if you pick up your boarding pass, it can tell you that your flight is delayed, that the gate has changed, etc. And, if you need to know what the current time is it's as simple as drawing a watch -- (Laughter) (Applause) on your arm. So that's where we're at so far in developing this sixth sense that would give us seamless access to all this relevant information about the things that we may come across. My student Pranav, who's really, like I said, the genius behind this. (Applause) (Standing ovation)

He does deserve a lot of applause because I don't think he's slept much in the last three months, actually. And his girlfriend is probably not very happy about him either. But it's not perfect yet, it's very much a work in progress. And who knows, maybe in another 10 years we'll be here with the ultimate sixth sense brain implant. Thank you.

(Applause)
