
Original & Concise Bullet Point Briefs

The Revolution Of AI | Artificial Intelligence Explained | New Technologies | Robotics

Revolutionary AI: Autonomous Cars and Robots Set To Transform Our World

  • Artificial Intelligence (AI) is advancing rapidly, with self-driving cars, drones and androids already making an impact
  • AI can learn from experience like humans do, but it must be programmed to make decisions on the fly
  • Scientists at Carnegie Mellon University are working to develop self-driving cars and robots which can navigate their environment autonomously
  • Machine learning allows computers to rewrite their own programming based on interactions with the world, becoming smarter on their own
  • Robots like R1 and R2 have been designed for search and rescue missions in dangerous environments, navigating without a map.

Robots Unveil New Technologies to Explore Unexplored Environments

  • Robots are using a variety of technologies, such as light detection and ranging (Lidar) and wireless networks, to explore unknown or unexplored spaces
  • Steve Willitz shows how the R2 robot can detect a stranded person in a search and rescue scenario, sending data back to home base to build a map
  • Jason Derenick of Exyn Technologies is working on an autonomous drone that can fly itself and map its environment as it goes
  • Vijay Kumar at the University of Pennsylvania is developing technologies for swarms of drones to perform tasks cooperatively.
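The ranging principle behind lidar mentioned above is time of flight: the sensor times a laser pulse's round trip and converts that time into distance. A minimal sketch of the arithmetic (the function name and numbers are illustrative, not taken from any real lidar API):

```python
# Time-of-flight ranging: a lidar pulse travels to an object and back,
# so the one-way distance is (speed of light * round-trip time) / 2.
SPEED_OF_LIGHT_M_S = 299_792_458

def distance_from_echo(round_trip_seconds: float) -> float:
    """Return the one-way distance in meters implied by a lidar echo."""
    return SPEED_OF_LIGHT_M_S * round_trip_seconds / 2

# A pulse that returns after roughly 66.7 nanoseconds bounced off
# something about 10 meters away.
print(round(distance_from_echo(66.7e-9), 1))
```

Real sensors repeat this measurement many thousands of times per second across many angles, which is what produces the 3D point cloud the robots navigate with.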

AI-Driven Robots Revolutionize Drones, Agriculture, and Job Creation!

  • Vijay’s apprentice Dinesh Takur is demonstrating how drones can use visual tags to reference their position in space and work together as a collective entity
  • AI-driven robots are being developed to help with precision agriculture, protect the environment, boost the food supply, and assist with pollination
  • Future robots will be able to autonomously coordinate flock movements, deploy at the first sight of an oil spill or map much faster than single drones
  • Artificial Intelligence is expected to create jobs rather than replace human labor
  • Julie Shah is leading research on how machines can learn from humans by becoming immersed in an environment and observing tasks as people perform them
  • Ankit Shah taught the robot to set a table by demonstrating the task while the robot observed for two weeks
  • The robot software has learned what each object is and is dynamically thinking about where each item should go.
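One way to picture what the robot might extract from two weeks of watching demonstrations: tally where each object ends up across the demonstrations and keep the most frequent placement per object. This is only a toy sketch of learning-from-demonstration, far simpler than the actual MIT software:

```python
from collections import Counter, defaultdict

def learn_placements(demonstrations):
    """Each demonstration maps object -> observed position; return the
    most frequently observed position for every object."""
    tallies = defaultdict(Counter)
    for demo in demonstrations:
        for obj, position in demo.items():
            tallies[obj][position] += 1
    return {obj: counts.most_common(1)[0][0] for obj, counts in tallies.items()}

demos = [
    {"spoon": "right of bowl", "cup": "upper right"},
    {"spoon": "right of bowl", "cup": "upper right"},
    {"spoon": "left of bowl",  "cup": "upper right"},  # one noisy demonstration
]
print(learn_placements(demos))
# {'spoon': 'right of bowl', 'cup': 'upper right'}
```

Even this crude tally captures the key idea from the segment: the robot learns a general rule about where items belong rather than memorizing one fixed sequence of actions.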

Robotics Revolutionize Factory Productivity with Human-Machine Cooperation and Artificial Consciousness

  • Robotics have come a long way in terms of artificial intelligence, from being able to make predictions to having the ability to anticipate human actions
  • Abby is an industrial robot that can work safely with a human partner in a factory
  • Artificial Intelligence has developed so far as to enable robots to coordinate and work together with humans, creating increased productivity and accuracy
  • Dr. Hod Lipson believes machines can be sentient, which requires the development of robotic consciousness
  • To build an image of its physical self inside its computer mind, robots need proprioception, the same self-awareness humans gain at around one year old.
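The babble-then-self-model loop behind proprioception can be caricatured in a few lines: issue random motor commands, observe the outcomes, and fit a model of your own body from the data. A deliberately tiny sketch, assuming a one-joint arm whose response is a single unknown gain (the real robots learn far richer self-images):

```python
# Toy "self-model": the robot sends random motor commands, records where
# its arm actually ends up, and estimates the unknown gain of its own
# body from the data (angle = gain * command).
import random

def babble_and_fit(true_gain: float, trials: int = 100) -> float:
    random.seed(0)
    commands = [random.uniform(-1.0, 1.0) for _ in range(trials)]
    angles = [true_gain * c for c in commands]  # observed outcomes of babbling
    # Least-squares estimate of the gain: sum(c * a) / sum(c * c)
    return sum(c * a for c, a in zip(commands, angles)) / sum(c * c for c in commands)

learned = babble_and_fit(true_gain=0.75)
# The fitted self-image now predicts outcomes of commands never tried.
print(round(learned, 3))
```

The point of the sketch is the shape of the loop, not the model: like the baby touching its nose, the robot compares predicted outcomes against felt outcomes until its self-image is correct.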

Harvard Researchers Push Boundaries of AI to Create Hyper-Realistic Avatars

  • The video is about research into creating AI robots with an internal model of the world that allows them to understand their movements and location in space
  • Barbara Grosz of Harvard University pioneered natural language processing, which enables machines to understand humans
  • AI is currently focused on narrow tasks but researchers are training AI with human conversations so that it can understand context
  • Hao Li’s company Pinscreen is developing AI algorithms to create hyper-realistic digital avatars
  • Creating believable faces is one of the hardest things to bring to the virtual world.

Exploring the Ethical Implications of AI-Driven Technology

  • AI technology can be used to create virtual living beings and superimpose faces onto them in real time
  • Realbotics is developing Androids with friendly faces that people may feel comfortable with
  • AI chat bots are able to have natural conversations, detect emotions, and provide companionship
  • Drones are a potential threat with their ability to be weaponized
  • There are ethical considerations for the use of AI-driven technology, such as privacy concerns.

The Benefits of Technology: Unlocking Human Potential and Positively Impacting Daily Life

  • Technology has the potential to provide tools that can augment intelligence, make better decisions and give better insights about the world
  • Robots are expected to exceed human capability in certain domains over time
  • Computers can be programmed to carry out one’s vision
  • This technology has the potential to positively affect daily life if used correctly.


This is what the future of hyperintelligence looks like: most people no longer own cars; instead, artificial intelligence operates fully electric, networked self-driving vehicles. As a result, air pollution and traffic congestion plummet across the planet. Self-navigating aerial drones are on the front lines for disaster response and search and rescue missions. Most people live and work side by side with self-aware androids. These AI companions boost productivity and liberate humans from tedious tasks, completely revolutionizing modern life.

"I feel like I'm in a superhero movie." Today, scientists are blazing a trail to this very future. "The fact that we're enabling the system to make its own decisions... I don't even know where to begin with that." I want to know what breakthroughs are being made. "It's talking, and it's having this dynamic conversation with you. That's the wonder." "Machines can be self-aware in ways that we can't. That will forge the future too." "Oh my gosh, it's looking at me." Hyperintelligence.

Bigler: As an engineer and neuroscientist in training, I'm obsessed with artificial intelligence. As a kid, my father took me to tech and robotics trade shows, where I became dazzled by science. Every year the inventions were smarter and smarter. Artificial intelligence has come a very long way in the last couple of years. Most AI technologies are programmed to think for themselves, to learn from examples, kind of like simulating human intelligence in a way that it learns from past experience. But how does AI actually work? In the future, will AI achieve human traits like emotion, consciousness, or even free will? And how will humans and robots work together?

Today, the clearest road to the future is the self-driving car. Unlike a regular car, which is just a machine, a self-driving car is a robot that can make decisions. In the future, will every car on the road become driverless? To find out, I've come to a hotbed of self-driving car research: Pittsburgh, Pennsylvania. Every single person has started to have conversations about self-driving cars, because essentially they're the future. But in order to understand it, we have to look under the hood. Making decisions on the fly, even simple ones like these, does not come easy for computers. To discover the inner workings, I'm meeting a true pioneer in the field. "Please, get in." "Thank you." Dr. Raj Rajkumar of Carnegie Mellon University. Carnegie Mellon is the birthplace of self-driving car technology, thanks in large part to the work of Raj and his colleagues. They've been the leading innovators in this field for more than 25 years. So how does his self-driving car make decisions to safely navigate the world like a human driver? "Um, should we get started?" "Yes, we can." Since Raj is distracted by our conversation, for safety reasons the state of Pennsylvania requires another driver in the front seat to monitor the road. This is so cool; I'm nervous but excited. "What's the longest you've ever driven a vehicle autonomously?" "Oh, we have gone hundreds of miles." "Awesome." "I'm going to go auto by pressing this button." "Oh my gosh, it really is driving itself."

While most self-driving cars are built from the ground up, Raj just bought a regular used car and hacked it with powerful onboard computer systems, making it more adaptable than other regular cars. "We installed a bunch of sensors in them. It is able to shift the transmission gear; it is able to turn the steering wheel, apply the brake pedal and the gas pedal. It's really the software that runs on the computers that makes this capability really practical, and there are some very key, fundamental artificial intelligence layers that try to mimic what we humans do."

To mimic human decision making, most self-driving cars use a combination of cameras and advanced radar to see their surroundings. The AI software compares external objects to an internal 3D map of streets, signs, and transportation infrastructure. "The map is something that is static in nature. Traffic and people and objects are dynamic in nature. The dynamic information it figures out on the fly." Comprehending dynamic information allows it to understand where it is heading objectively in space and react to changes in traffic signals. "Aha, it recognizes the stop sign." "Yes." "Excuse me, a pedestrian." "We definitely should not be honking at that person." The AI challenge to make a vehicle drive itself is not an easy task. "Safety is priority number one, and priority number two, and number three as well." But what happens when the AI system doesn't understand specific objects in its surroundings? "A pedestrian in Tempe, Arizona was killed last night by a self-driving taxi. It's believed to be the first fatality caused by an autonomous vehicle." This tragic accident happened because a self-driving vehicle didn't recognize something in its environment: a jaywalker. In the future, advanced self-driving cars will have to make life and death decisions on the fly. If avoiding the jaywalker means crashing head-on with another car, potentially killing the driver, what should it choose? How will scientists address monumental problems like these?

The first wave of artificially intelligent robots were programmed by engineers with static sets of rules to achieve their goals. These rules are called algorithms, but not all rules work in all situations. This approach is very inflexible, requiring new programming to accomplish even the smallest changes in any given task. A new approach called machine learning has changed everything. With machine learning, computers can absorb and use information from their interactions with the world to rewrite their own programming, becoming smarter on their own.

To see machine learning in action, I'm meeting another Carnegie Mellon team at an abandoned coal mine. Dr. Matt Travers leads a group that won a challenging subterranean navigation competition held by the Department of Defense's research agency, DARPA. They're affectionately known as R1 and R2, and R stands for robot. These robot twins are designed for search and rescue missions too dangerous for humans, and unlike the self-driving car, they operate without a map. To achieve this, they have to learn to identify every single object
they encounter on the fly. "They are programmed to go out and actually act fully autonomously, and they will be making 100 percent of their own decisions. So they're recognizing objects; they're making the decision of where to go next, where to explore." To see this in action, the R2 robot is starting on a simulated search and rescue mission to find a stranded human dummy in the mine. "Imagine having a map of a collapsed mine before you sent a team of people to go rescue someone in that mine, right? Like, it's a game changer."

How the robot discerns elements in this environment parallels how an infant learns about her environment. A three-month-old uses her senses to cognitively map out her environment and learn to recognize her parents. She ultimately uses this map to interact with everything in her world, just like this robot. "Okay, so are we ready to roll?" Artificial intelligence makes this learning curve possible, but how does it create its own map and identify a human on its own, without an internal mapping system or the internet? Test engineer Steve Willitz shows me how the R2 robot can detect a stranded person. "When you're in a search and rescue scenario, that's the kind of situation where you'd want to deploy one of these." As it explores and maps the cave, it drops devices called signal repeaters to create a Wi-Fi network trail. "It drops those just like breadcrumbs along the path." Using this network, the robot sends data back to home base to create a map. At the same time, the robot must look at every single object to identify the stranded human. "So the lidar system is giving a full laser scan." Lidar stands for light detection and ranging. Similar to its cousin radar, which uses radio waves, lidar systems send out laser pulses of light and calculate the time it takes to hit a solid object and bounce back. This process creates a 3D representation of the objects in the environment, which the onboard computer can then identify. This process is similar to how the eye feeds visual data to the brain, which then recognizes objects by tapping into our pre-existing knowledge of what things look like. Fully understanding its environment, R2 can then make better decisions about where to go and where not to go. "What are the robots doing right now?" "It's exploring. So the robot came to a junction, and off to the left it could see that it couldn't get past, right? So it saw the opening to the right, and that's where it went." "It kind of looks like it's making decisions about whether or not to climb over these planks and obstacles all up in this area." "Right, that's exactly what it's doing at this point." Just like a baby, R2 learns through trial and error. It's like a little dog wagging its tail, but there's no one here to rescue, so it moves on. As R2 continues to map out the mine, it stumbles upon its intended target. "Oh my God, a human!" "That is Randy, rescue Randy." "Hello, rescue Randy. You scared me." With the discovery of rescue Randy, the R2 robot can not only alert emergency personnel but also give them a map on how to find him. "That is incredible. It knows what it's doing." These incredible rescue robots are clearly paving the path to the future of hyperintelligence.

In the future, autonomous exploration vehicles perform search and rescue missions in every conceivable disaster zone, even in avalanches atop Mount Everest. Incredibly intelligent off-road vehicles are also mapping deep cave systems previously unknown to science, discovering a vast supply of rare earth elements essential for modern technology.

Artificial intelligence will clearly save human lives in the future, but there's a lot of terrain on Earth that's too difficult to navigate on wheels. How will intelligent robots make their way over rainforests, bodies of water, or even mountaintops? In Philadelphia, Jason Derenick of Exyn Technologies is working to overcome this problem. "What we focus on is autonomous aerial robotics to enable drones to safely navigate in unknown or unexplored spaces." Jason's team has built the first industrial drone that can fly itself and map its environment as it goes. "We focus on all aspects of autonomy, which includes perception, orientation during flight, motion planning, and then finally control." But going from two dimensions to three dimensions requires an increase in artificial intelligence processing. The mission for their drone is to fly independently through a three-dimensional path from one end of the warehouse to the other. "Starting mission: three, two, one." Now, to mess with its computer mind, Jason's team places new and unexpected obstacles in its path. Will the drone recognize these unexpected changes? Will it get lost? Will it crash? "Essentially we have a gimbaled lidar system that allows the vehicle to paint a full 360-degree sphere around it in order to sense its environment." Like the robot in the mine, this aerial robot uses lidar to see. "It actually generates a voxelized representation of the space, which you see here, and for each one of these cubes in the space it's trying to determine whether that cube is occupied or whether it's free space." It decides where to go based on its visual input, kind of like us humans. Incredibly, the drone recognizes the whiteboards and flies around them. "One of the things about this system that makes it particularly special is that it's actually being used in the real world to keep people out of harm's way." They're already at work in hazardous industries like mining, construction, and oil exploration. They safely conduct inspections in dangerous locations and create detailed maps of rough terrain. "From a technological perspective, the fact that we're able to do everything that we're doing on board, self-contained, and enabling the system to make its own decisions... I don't even know where to begin with that."

Self-flying robots like these will revolutionize search and rescue and disaster response. They could also transform how packages are delivered. But there are limits to what large single drones can do; more complex tasks will require teams of small, nimble, autonomous robots. Dr. Vijay Kumar at the University of Pennsylvania is working with swarms of drones to perform tasks like playing music or building structures cooperatively. He's also developing
technologies to tackle some very big problems, including world hunger. "In a couple of decades we'll have over 9 billion people to feed on this planet. Of course, that's a big challenge." To take on a task this big, he's building an army of small flying robots with the ability to synchronize. "We think about technologies that can be mounted on small flying robots that can then be directed in different ways, like a flock of birds reacting to a predator, or a school of fish. You have coordination, collaboration, and it all happens very organically." Using AI to get robots to work as a coordinated collective group is a daunting task. "Three to five years ago, most of our robots relied on GPS-like sensors. Today we have the equivalent of smartphones embedded in our robots, and they sense how fast they're going by just looking at the world, integrating that with the inertial measurement unit information, and then getting an estimate of where they are in the world and how fast they're traveling." This I gotta see, and I'm gonna check it out virtually, as a robot. I'm at UPenn, remotely, in Vijay Kumar's lab. "Sample my surroundings. Oh, I hit something. Hello!" "Hi." Vijay's apprentice Dinesh Takur is my guide. "Today we are going to show robots flying in a formation." "Great, can we see how that works?" "Sure, yeah." The first step Dinesh takes in coordinating the drones is to provide them with a common point of reference, in this case a visual tag similar to a basic QR code. Using only the onboard camera, these drones reference the code on the tag and visualize where they are in space. Using sophisticated bio-inspired algorithms, the drones then figure out where each other drone is within the collective swarm. "These drones are communicating with one another, right?" "Yeah, right now they're communicating over Wi-Fi." "So cool." Future versions of these drones will create their own localized wireless network to communicate, but for now this swarm is a proof of concept. "You've defined a formation, and then they're assuming that formation?" "Yeah, I just say I want to form a line, and the drones themselves figure out where they should go." Once they figure out where they are in relationship to each other, they can then work together to accomplish a shared goal, like ants working as a collective entity. "Once they can coordinate between each other, we can send them out to do specific missions." "That's really cool." Swarms of flying robots have their advantages. Unlike a single drone, self-coordinating swarms can perform complex operations like mapping much faster by working in parallel and combining their data, and losing one drone in a swarm doesn't doom the whole operation.

Vijay imagines employing his advanced swarm technology to work on farms. This precision agriculture will help feed the world's growing population. "We'd like robots to be able to roam farms and be able to provide precise information about individual plants that then could be used to increase the efficiency of food production. That would be a huge impact on the world. This is our duty as responsible citizens and as responsible engineers." The high-flying approach towards resolving the problems of the future is definitely a path.

In the future, artificial intelligence coordinates flocks of drones to protect the environment and boost the food supply. To combat the negative effects of climate change on agricultural crops, robotic bees assist with pollination in orchards and on farms, making them more sustainable and productive. Fish-shaped underwater robots automatically deploy at the first sight of an oil spill. These drones create a barricade to rapidly contain spills, saving marine life and oceans across the world.

Modern society has a long history of building robots to do work that's dangerous, difficult, or too repetitive for humans. AI is poised to automate all kinds of tedious work, ranging from factory work to taxi driving to customer service. While some are worried that smart robots will replace human labor, that's not necessarily the case. As a sector, artificial intelligence is expected to generate 58 million new types of jobs in just a few years. So what will the future of human-robot interaction mean for our work and livelihoods?

I'm at the Massachusetts Institute of Technology to meet Dr. Julie Shah. She's leading groundbreaking research in human-robot collaboration. "My lab works to develop robots that are effective teammates with people." Julie and her team are creating software that helps robots learn from humans, even giving them insight into different human behaviors. By being aware of real people, robots can directly work and interact with them. "How do you teach these robots or machines to do these human-like tasks?" "The first step, as it would be for any person: the first thing they do is become immersed in the environment and observe. And then we need an active learning process. The robot needs to be able to communicate, or show back to the person, what it's learned. We don't want the robot to learn the direct sequence of actions; we want the robot to learn this more general understanding. That's ultimately, like, our challenge." But getting a robot to grasp the bigger-picture concept in order to understand the basics of its task in the first place requires a lot of observation and, well, hand-holding. "My research is focusing on trying to make robot programming easier by trying to teach robots how to do tasks by demonstrating them." Julie's colleague Ankit Shah shows me how this robot is learning to set a table. "So this is all the silverware and the plates, the bowls, the cups, and this is the table that it has to set?" "Yes, that is correct." "Okay." As any parent knows, the first step in helping a child to learn is to model the desired behavior. It's the same with machine learning. In this case, the AI robot recognizes the objects with a visual tag similar to a QR code, and for two weeks it observes Ankit setting a table. "So you'd pick up an item and then place it on the dinner table?" "That's basically what we did, and based on that, the robot learns what it means to set a table." Dynamic tasks like setting a table or doing laundry are easy for humans but incredibly hard for robots. The software has
difficulty with so many variables, and even subtle changes in their environment can throw them off. "One of the things which I like to do is to actually hide some of the objects, so it's not going to see the spoon, and the reason we do this is we want to show that the robot is robust to some of the disturbances in the task." The robot software has learned what each object is and where it goes. Now let's see if it's learned the concept and can think dynamically to set the table. "So you can just pick up the card." "Here we go. I've revealed the spoon." Incredibly, the robot recognizes the spoon and instantly places it next to the bowl. This reveals that the robot has learned the concept and executes the right action dynamically. In the process, the software is continuously writing and revising its own computer code. Basically, it's learning.

If, like humans, robots can grasp the bigger-picture context and not just the mathematical tasks, will AI-driven robots of the future spell the end of having to work? "The key aspect is not developing AI to replace or supplant part of the human work, but really, interestingly, how we fit them together as puzzle pieces. People work in teams to build cars, to build planes, and robots, we need to be an effective team member. It's real teamwork, as if you're in a basketball game. You have your goal, right, and you have to think spatially: who am I going to pass the ball to, and at what time do you do that, so that everything matches up." The analogy of a basketball team is outstanding, because we actually need to know spatially where they're going to be, and the timing is of critical importance, and so we need to develop the AI for a robot to then work with us.

One of the most difficult aspects of creating hyperintelligence is actually something that even we humans sometimes get wrong, and that is anticipation. Anticipating what a teammate or co-worker might do requires understanding contextual information on a much more sophisticated level, and predicting what will happen next. Can robots make predictions as accurately as we can? Abby is an industrial robot. The team is giving this Abby machine the intelligence necessary to help it anticipate a human co-worker's actions. "This is a simulated manufacturing test that we have set up." "Okay." "To simulate some sort of task that a person and robot could visibly work on together." For safety reasons, actual human-robot interaction is at present fairly minimal. "Typically in a factory you would see these guys behind a metal cage, and you wouldn't have people working with them, so what we're trying to do is make something that a person could safely interact with." "What do the human and robot do together in this task?" "On this task, a person is placing fasteners in some surface of a plane, and a robot is applying, like, a sealant over to seal it in." "Okay, can we see it happen?" "Sure." The robot must first be able to see and recognize the actions of its human counterpart and adjust to the person's every move. "Ooh, I feel like I'm in a superhero movie." "So the cameras in the room can see these lights and track your hands, so that your hand doesn't get cut off by the robot." "That's right, yeah. So the cameras and the lights basically work as eyes for the robot." "So that's how the robot knows where I am." The monitor shows the visual representation of the room that's inside the robot's mind. "So this is what the robot might be doing if, you know, I'm not in the way, and the robot's just sealing, and I'm not supposed to be here. I put my hands in the robot's way." By quickly understanding this human action, the AI software reacts accordingly, by stopping. "It's important to be able to share the workspace." Building on this sense of teamwork, Pem's next step is helping Abby anticipate where he will move next based on subtle contextual cues. "So in this case, the robot will not only track which actions I've done so far, but also anticipate which portion of the space I'm going to be using." "Okay." "And when it's planning its own motions, it'll avoid those locations so that we can work more closely together. So what you'll see now is, after I place this bolt, the robot is going to predict I'm going to go to this one next. So what you'll see is it'll behave in a different way. So now that I place this bolt, the robot kind of takes a more roundabout path that allows me and the robot to work more closely together, and I don't have to kind of worry about it crashing into me, because I can see that it's trying to avoid me." "So similarly, on this side, I place this bolt." "And the robot takes some more kind of, like, roundabout path." "Yeah, because you're going to go there." "Now it slows down because I'm close to it, right?" "We work together at the same time. So not only is the interaction more efficient, in that the robot's not spending too much time standing still; it's safer, because the robot's not constantly kind of almost hitting me, and it also feels nicer for the person working with the robot." "I really love this theme of teamwork." Programming robots to coordinate with us and anticipate where we will move won't only revolutionize the workplace, but it will also change society at large.

In the future, the coordination of man and machine is so advanced that this collaboration increases productivity and accuracy in most industries. AI robots now accompany surgeons in hospitals across the globe. They anticipate the doctor's needs and hand them the appropriate medical tool just before it's needed. This dramatically reduces surgery times and human error.

And will artificial intelligence actually surpass human intelligence? While some machines have exceeded human ability in games like trivia, as with Watson, and in chess, these AI systems were designed to master just a single skill. These programs use brute-force computer processing power and specially tailored software to beat their human opponents. To achieve the holy grail of hyperintelligence, scientists must develop systems with flexible, human-like abilities to both learn and think. This form of smarts is called artificial general intelligence. I'm back in New York City, on my own campus at Columbia University, to meet with Dr. Hod Lipson. Hod's lab is developing creative robots that paint original artworks,
self-assembling robotsand even robots that learn about theirworld without human assistancebut his ultimate goal is even moreambitiouscan a machine think about itself Can ithave free will I believe that in factmachines can be sentient could beself-aware in ways that we can'tas a neuroscientist I know we've onlyscratched the surface of our scientificunderstanding of how Consciousness Worksin humanshow could one possibly use computer codeto put this Transcendent feature into arobotour hypothesis is actually very simpleit is that self-awareness is nothing butthe ability to simulate oneselfto model oneself to have a self-imagethe first step towards creating roboticConsciousness is to teach the softwareto build an image of its physicalmechanical self inside its computer mindwe humans take Consciousness like thisfor granted even in simple moments likeunderstanding our own image reflected ina mirror humans start to developawareness of their own emotions andthoughts around the age of one thishelps babies understand their self-imagein their minds and it helps them tolearn about their environment and theirrole in itwhen a robot learns what it is it canuse that self-imageto plan new tasksboth humans and robots awareness aboutthe ethical self is calledproprioceptionneuroscientists sometimes call thisself-awareness of our bodies a sixthsensewe use the same test that a baby does inhis crib a baby moves around flailsaround moves its arm in ways that lookrandom to us but they're not random andthen it touches its nose right now if itbrain predicts that it's going to feelsomething and it actually feels thatthat means that its self-image wascorrect same thing happens with therobot if proprioception can be developedto the same level as humans this couldlead to robotic consciousnesshot's colleague Rob quiatkowski is theproud parent of a brand new baby robotthat he builtand by interacting with its surroundingsit's in the process of developing itsown internal self-image so what arethese 
claws? So these are actually feet. They're designed for walking on carpet, but as of now it doesn't really walk. It's still kind of a baby; it needs to learn how the world works first. What do you mean, it's still kind of a baby? What does it do? Like a baby, it's sending completely random actions to each of these robot arms and really trying to get an understanding of itself. Primarily, it looks like a spider that doesn't know how to use its legs. Yeah, I guess that's a pretty good way to put it.

So it will really be learning by doing this babbling for somewhere on the order of a day to a week. It will process this data to create an informative model of itself, and from there imagine how it would walk, and then execute that walking in the real world. These are the first baby steps toward developing its self-image, and like a baby, it will eventually learn to walk. We know this because an earlier version of this robot, using the same technique, learned to walk after 100 hours.

But walking won't by default lead to robotic consciousness, and that's why self-awareness is so crucial. This is a robot which we've taken to calling a self-aware robot. It is self-aware pretty much in a literal sense: it is aware of itself, its location in space, and its dynamics, as in how it moves. Okay, so it kind of understands its own movements and where it is in space. How does it do that? By leveraging a technique which has become popular in recent years called deep learning. Deep learning is a form of artificial intelligence that, like the human brain, learns from raw data, unsupervised and without structure. Deep learning gives machines the ability to experience and process reality like us.

Rob has devised an experiment to test what this robot knows about its world. You can think of it as if you're looking at these red cotton balls: you have some idea in space as to where they are, right? Now if you were to close your eyes and try to pick them up and put them in this cup. So obviously it's not a trivial task. No, but it's not the most difficult task in the world, because we
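(The babbling-then-modeling loop Rob describes, sending random actions and fitting a predictive self-model to the recorded outcomes, can be sketched in a few lines. Everything below is a hypothetical toy, not the lab's actual code: the "robot" is a simulated two-joint arm with made-up linear dynamics, and the self-model is a plain least-squares fit rather than a deep network.)

```python
import numpy as np

# Toy "robot": a 2-joint arm whose true dynamics the learner must discover.
def step(state, action):
    return state + 0.1 * action  # each joint moves proportionally to its command

rng = np.random.default_rng(0)

# 1) Motor babbling: send random actions, record what actually happens.
states, actions, next_states = [], [], []
s = np.zeros(2)
for _ in range(500):
    a = rng.uniform(-1, 1, size=2)
    s_next = step(s, a)
    states.append(s); actions.append(a); next_states.append(s_next)
    s = s_next

# 2) Fit a forward self-model: predict the next state from (state, action).
X = np.hstack([states, actions])           # 500 samples of (state, action)
Y = np.array(next_states)
W, *_ = np.linalg.lstsq(X, Y, rcond=None)  # linear self-model

# 3) "Imagine" before acting: predict the outcome of a fresh action.
s, a = np.array([0.3, -0.2]), np.array([1.0, 0.5])
predicted = np.concatenate([s, a]) @ W
actual = step(s, a)
print(predicted, actual)  # a good self-model predicts its own motion
```

The same structure scales up: replace the least-squares fit with a deep network and the toy arm with real motor and sensor streams, and the robot can "imagine how it would walk" by rolling the model forward before ever trying a gait in the real world.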
have good proprioception. We have this good model of ourselves: you know where your arm is in space relative to the other things that you see in space. But for this robot there's a catch. So where are the cameras? There's no camera, none; it's as if you were to close your eyes. It knows the locations at the start, and it's picking them up and placing them completely blind. Furthermore, it was not given a map or any formal instructions; the robot simply has to feel its way through the task. All right, let's see it give it a shot.

First the robot learned how to use its arm through trial and error, developing a sense of proprioception. By exploring its surroundings, it generated an internal representation of the world and its place in it. The robot is using only its internal image of the external world to maneuver its arm, pick up all nine balls, and place them in the cup. I'm not sure that's something I could do with my eyes closed. Huh, it's really just based off of understanding where you are in space. Yeah, that's right.

Creating AI robots that have an internal model of their world is an important step toward machine self-awareness. Self-awareness is sort of a similar proprioceptive capability, but applied to mental thinking. So if they think about thinking, they think about what they are, because once you can do that, it means you can plan things into the future.

Once robots become self-aware, they will need advanced ways to communicate with humans; keyboards and screens are inadequate for complex thoughts. Robots will need to learn to speak and have natural conversations, like a baby who listens to those around her and learns to talk. Laying the groundwork for this kind of human-machine interaction is pioneering scientist Barbara Grosz of Harvard University. Her seminal work in what's called natural language processing directly led to the development of voice-activated artificial intelligence, you know, like Alexa or Siri.

Natural language processing actually predates artificial intelligence and started with machine translation efforts. The ability for a
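(Acting "with your eyes closed" from an internal model alone can be illustrated with a toy open-loop reach. This is an assumed illustration, not the experiment's real code: the arm dynamics, the internal model, and the greedy controller are all invented stand-ins, and the internal model is simply assumed to be accurate, which is exactly what the babbling phase is for.)

```python
import numpy as np

# Toy arm in a plane: the hand position is the state; a command nudges it linearly.
A = 0.1 * np.eye(2)          # the arm's true (hidden) response to commands

def move(hand, cmd):
    return hand + cmd @ A    # the real world, which the robot cannot see

# The robot's learned internal model of its own dynamics (assumed accurate here).
A_model = 0.1 * np.eye(2)

def reach_blind(hand, target, steps=20):
    """Pick commands using ONLY the internal model; no camera, no feedback."""
    for _ in range(steps):
        # Imagine what command would close part of the remaining gap...
        cmd = np.linalg.solve(A_model, target - hand) / 5
        # ...then execute it open-loop and update the *imagined* hand position.
        hand = move(hand, cmd)
    return hand

ball, cup = np.array([0.8, 0.4]), np.array([0.0, 0.0])
final = reach_blind(ball, cup)
print(np.linalg.norm(final - cup))  # small residual: an eyes-closed reach
```

The point of the sketch is the information flow: every command is chosen by consulting the self-model, never by observing the world, which is what lets the real robot place the balls "completely blind."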
computer system to carry on spoken dialogue with a person has been a long-standing goal of artificial intelligence research from its inception. And it turns out this is a challenge, because when you speak, what you say really depends on the context in which you say it. Another challenge is that the meaning of words can change depending on how they are delivered. So one example is the contrast between saying "that's fabulous" one way and "that's fabulous" another. Also, when we have a conversation, we mark paragraphs at the beginning with a rise in intonation and a fall at the end. So there's a whole way the speech signal tells you something about the context and something about the intended meaning.

Barbara's early research led to methods for programming computers to understand the meaning of spoken language by using clues from a person's tone and context. So let's flash forward: the speech systems are amazing now, because there are lots of recordings of people speaking that they can build their systems on. As a result, AI has gotten much better at understanding. There's still room for improvement, though. The systems that do exist are pretty much focused on very narrow tasks. Take Siri and Alexa as examples: they're mostly oriented around a single question or a single request, and they presume that anybody will stay within the range of behavior that the designers imagined.

So researchers are turning to machine learning. By training AI with hours and hours of human conversation, it can learn to better understand the context of how humans converse. Future versions of this technology will allow us to have natural conversations with our computers. One of the things that's amazing to me is that the field has succeeded so well that there are devices out in the world that people use every day. I never dreamed that would be the case in my lifetime.

Hyper-intelligent natural-language AI will change the way we interact with our computers and robots. But this advanced technology will never reach its full potential as a human companion until it looks convincingly like us. I'm in Los Angeles to
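(The prosodic cue Grosz mentions, a rise in pitch versus a fall, is something software can read directly off the speech signal. The sketch below is a deliberately crude, hypothetical illustration, not her method: it fits a straight line to a pitch contour and labels its overall direction, whereas real systems use far richer features.)

```python
import numpy as np

def contour_shape(f0):
    """Label a pitch (F0) contour, in Hz per frame, as rising or falling.

    A rising contour often signals a question or the start of a new topic;
    a falling contour often signals finality, as at the end of a "paragraph."
    """
    f0 = np.asarray(f0, dtype=float)
    slope = np.polyfit(np.arange(len(f0)), f0, 1)[0]  # overall trend
    return "rising" if slope > 0 else "falling"

# "That's fabulous." said sincerely (falling) vs. skeptically (rising):
print(contour_shape([220, 210, 200, 190, 180]))
print(contour_shape([180, 190, 205, 225, 250]))
```

Even a signal this simple shows why tone matters: the words are identical in both utterances, and only the contour distinguishes the intended meanings.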
meet Hao Li. His company Pinscreen is giving AI a human face. They're developing cutting-edge techniques to create hyper-realistic digital avatars in an instant. One of the hardest things to bring to the virtual world are humans, right, and specifically faces. To create believable faces, Hao is relying on complex AI algorithms. It's an artificial intelligence that actually digitizes yourself into the computer by just looking at a single image, or, you know, partial information. And it's not just a static 3D model, but one that can also be animated and brought to life.

Other methods of generating lifelike avatars need to capture multiple angles of a face in motion, and they can take hours to render. But not Pinscreen's technology. I can show you real quick how this works. Do you want to see? Yeah. Incredibly, his software also allows him to superimpose any face he wants in real time. So if I do this, this blue face is basically a face tracker. So in real time it's actually modeling my face in 3D. If I move my face around, the blue mask is basically a three-dimensional representation of my face. Wow, it's kind of like a green screen.

Like a Hollywood CGI film, the computer dynamically models Hao's face and tracks his movement on the fly. Click on him. Turn myself into Putin. Wow. It's basically generating the whole thing in real time right now. Oh my God, Putin's talking to me right now. Political leaders aren't the only thing Hao can generate. Audrey Hepburn. It's generating all the pixels in real time. These teeth are never seen in this picture, so it's predicting what your teeth would end up looking like. These aren't even your teeth? Yeah, these are not my teeth; it's actually generating... oh my goodness.

Hao believes that software like this will give a more human face to the digital world. Ultimately this will result in friendlier-looking androids and even virtual beings. I've been hanging with my dog for a while; do you have pets? I have three toy poodles. Someday we're actually going to interact with virtual beings that are going to assist
us in our life. Imagine instead of talking to a Siri or Alexa, you're talking to a face, right? And the best way to communicate is to have face-to-face communication. An AI that provides you perfect companionship. This kind of technology will give AI a face that most people can relate to. Are you human, or are you artificial intelligence? That is a very interesting question. I think I am human, but I am artificial intelligence. Hyper-intelligent companions could usher in a more helpful and hopeful world.

High-powered virtual beings that look, talk, and even think like real humans are commonplace. These holographic assistants take care of many aspects of daily life, ranging from fashion advice to business consultation. Their faces and wardrobes can be customized depending on their role. When a doctor is required, these virtual assistants play the part and are always on call. Armed with the latest medical knowledge, they accurately diagnose most common diseases. The same technology is also capable of capturing the image, voice, and life story of loved ones. After death, these virtual friends and family are always a part of our lives.

Even if engineers can create lifelike robots that look like humans, called androids, in order for AI to become true companions, people will need to feel comfortable embracing these androids, figuratively and literally. I'm outside San Diego to meet Matt McMullen, the founder of Realbotix. Matt is building androids that people will want to embrace physically. The goal is to create not only a robot but an AI that are both appealing enough that someone would feel like they were actually getting to know someone, not something.

Once the sculpting and casting is complete, members of Matt's team work on an actual functioning robot. The faces are actually modular. The face just literally comes off? Yes, it does. The idea is you create one robotic head and a whole bunch of different characters that can all run on that same head. So all of the things that move in the face are actuated by these magnets that are in the skin. Programmers use
artificial intelligence and advanced chatbots for these robots. The goal is for them to have natural conversations with their companions. She blinked! Yeah, she's a blinker. It's looking at me. Hello, how are you today? I'm okay. I'm fine, I'm doing just fine. How are you? Why do you ask me that? Um, you know, because I care about your feelings.

Speech is only one aspect of human communication. Facial expressions are hugely important in social interaction, so Matt is incorporating this non-verbal communication into his androids. The vision system that we're working on, she'll be able to look at you and detect your emotion, by the expression on your face, by the temperature of your skin, and all these other things. So that's key, right? Yes, exactly. It looks remarkable. It's moving, it's talking, and it's having this dynamic conversation with you. That's the wonder.

I can imagine some people might walk in here and say, oh, look, a sex robot. The thing is, to make a really impressive and good sex robot, you actually have to make a good robot in the first place. But I think the longer-term goals are going to be to create these systems for people to use in whatever way they see fit. We're creating human-like robots that we think can be used for a huge variety of things: for people who are lonely, whether they're old, or maybe they're socially isolated, or maybe they suffer from social anxiety. Androids with a friendly face could keep the elderly company and monitor their health. Armed with artificial intelligence, these androids could take on other qualitative roles. I think therapy is a huge one: using the robot as a safe conduit for communication and letting people really open up, because they don't feel like they're being judged by something like this. Yeah. Companion androids like this will forge a future where nobody will ever have to feel alone again. Wow.

Lifelike human androids and virtual beings have the potential to enhance human social interactions, but there are ethical concerns as well. Using artificial intelligence, it's possible to hijack a person's physical identity.
There is one very big problem in the whole thing, which is privacy. What if I were to do something harmful to you? When you say harmful, you mean reconstruct somebody and have them say something that they would never say, or never do? Right. Digital fabrications like this are already emerging online, in what are called deepfakes. I can go on your website, take a picture from it, and then create some content with it, without your consent.

Also dangerous: swarms of AI-driven drones could be used in terrorist attacks. Can drones be weaponized? Of course they can. We recognize that the results of these scientific breakthroughs can oftentimes be used against humans. So you have to be held accountable for what you develop, and it's a moral responsibility to think about the broader consequences.

When it comes to ethical considerations like these, I feel hopeful that science, technology, and human ingenuity will find solutions to these big problems. The potential for artificial intelligence to profoundly improve society, to improve jobs, to improve health care, to improve education, is enormous, if we do it the right way. We're trying to build computer systems that assist people in doing what they're doing, better. Technology is more likely to provide some tools that will allow us to become superhuman: augment our intelligence, make better decisions, and get better insights about the world.

Future versions of this technology will become even more intelligent than us humans. I believe there's no doubt robots will exceed human capability. I mean, the path is very clear, whether it's going to take 20 years or 200 years. This is maybe the most powerful technology we've ever invented. You can ultimately, probably, program a robot or a computer to carry out your vision. And in the right hands, this technology has the potential to radically transform every part of daily life for the better. A true partnership with hyper-intelligent robots, with their intentions aligned with our own, will transform humanity for the greater good.