
Multifunctional, Connected Battle Bots with Augmented Reality Capabilities

Silas Adekunle and Johnathan Quinn speaking at Bristol JS in August, 2017

About this talk

Robotics + Gaming + Augmented reality = awesomeness. We'll take a look at what Reach Robotics are working on with MekaMon, their gaming robot. Plus, a demo with lots of hands-on opportunity.


Transcript


Good evening guys, thanks for having us. My name is Silas Adekunle. I'm the co-founder and CEO of Reach Robotics. And you've also got Johnathan here, Johnathan Quinn, our Head of Games; he's actually Doctor Jonathan but he doesn't like to be introduced that way. And-- - I don't mind too much. - And the product that you're seeing here, that I'll talk about, is called MekaMon, so they are multifunctional, connected battle bots with augmented reality capabilities. That's kind of the simplest way to describe everything that they do. So the company has been going for about four years, and I do recognise some of the faces in this room, so you might have seen us as we've been growing. So initially I studied Robotics at UWE, and it was during my studies that I had an idea for gaming robots, after a long series of lessons teaching students about robotics and STEM and things like that. And I was disappointed with the state of, you know, products in the industry. I was really excited, going off about robots and everything you saw in sci-fi movies, like Transformers and things like that. And then when it comes to real life, they're all kind of just shit, you know, all clunky and stuff. And it was like, okay, there's got to be a better way to create interesting things, and I think it's because I was so naive that I was able to go into this journey without understanding how complex and how crappy it was going to be, but it's been a lot of fun along the way. You know there's-- - We've hired some salespeople since then but-- - Yeah, we've hired some salespeople since then. So there was no way I could condense four years of lots of adventure, but I thought what I would share was what we are working on and some of the challenges that we've gone through, not just at a top level, but we'll focus a bit on what we've done on the augmented reality side of things as well, and hopefully that will give you a flavour of some of the things we've been working on. So, you know, just a quick overview of MekaMon. The idea was to create a video game character in real life, and for that you have to have a robot that can move in really interesting and unique ways and function and also feel like a character. And then, you know, the robots need to be upgradeable, so I will show you. This is a MekaMon gaming robot. It's got three degrees of freedom per leg, so what that means is you've got your knee, your thigh, and then your hip as well, so with that you've got twelve degrees of freedom all together, and that gives you really fluid and interesting motion. The legs are removable, so what that means is you've got modularity, so I can change out the legs in the future for wheels, for example, but I think legs are definitely the most interesting. You can express a lot more personality than, say, with wheels. So all the legs can come off, so it's fully modular, and they've got a battery sitting in the chest as well. You see a final product here that is now being manufactured. We've got a factory in China. A few months ago we closed our Series A. We've taken about $10,000,000 in funding so far, and the last round, the Series A, was $7,500,000. We have come some way. I'll show you some photos later, one photo, there's only two slides, but we'll have some demos of the app and things like that- - And actually it's pretty good. - Yeah, they're really good.
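To make the degrees-of-freedom point concrete, here is a minimal forward-kinematics sketch for a single three-joint leg. The joint names, link lengths and units are illustrative assumptions, not MekaMon's actual firmware interface:

```typescript
// A minimal sketch of forward kinematics for one 3-DOF MekaMon-style leg
// (hip yaw, thigh pitch, knee pitch). Joint names and link lengths are
// illustrative assumptions, not the real firmware interface.

interface LegAngles {
  hipYaw: number;     // rotation around the vertical axis, radians
  thighPitch: number; // lift of the upper leg segment, radians
  kneePitch: number;  // bend of the lower leg segment, radians
}

const THIGH_LENGTH = 0.06; // metres, assumed
const SHIN_LENGTH = 0.08;  // metres, assumed

// Foot position relative to the hip joint.
function footPosition({ hipYaw, thighPitch, kneePitch }: LegAngles) {
  // Planar reach and drop of the leg in the plane selected by the hip yaw.
  const reach =
    THIGH_LENGTH * Math.cos(thighPitch) +
    SHIN_LENGTH * Math.cos(thighPitch + kneePitch);
  const drop =
    THIGH_LENGTH * Math.sin(thighPitch) +
    SHIN_LENGTH * Math.sin(thighPitch + kneePitch);

  return {
    x: reach * Math.cos(hipYaw),
    y: reach * Math.sin(hipYaw),
    z: -drop, // foot sits below the hip
  };
}

// Three joints per leg, four legs: twelve degrees of freedom for the body.
const totalDegreesOfFreedom = 3 * 4;
console.log(totalDegreesOfFreedom, footPosition({ hipYaw: 0.2, thighPitch: 0.4, kneePitch: -0.9 }));
```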
We've gone through about 30 prototypes, or different iterations, going from different modes of locomotion to looking at different heads, and that was a challenge, just narrowing down balance and design and functionality, and John will talk a bit more about that. The robots are modular, so this is one leg I have taken off here. You've got the battery here, and then you can also take off the shields as well. So imagine this as me putting on different armour. The three cornerstones of entertainment and the toy industry are competition, collectability, and customization. If you start to look at the most successful products out there, you start to see they have maybe two or three of these things, because that's how you get those viral elements, that's how you get creativity, that's how you get people to start to compete against each other. We have all three, more or less, in this, and we are still a young company, so we are having lots of fun learning the right triggers to make some parts of the business more viable, but at the core of it all is making sure that we have a unique and interesting product that the world has never seen before. I'll just show you a few of, or a little bit of, what the robot can do. Where's the best space? Because not everyone can see on the floor. So you've got the robot by itself; the robot needs to be able to stand alone and be fun. We are looking at this as a- So it's scratching itself, for example, so you see it's got lots of personality and lots of character. In robotics and animatronics, if you want to give something character or that engagement, you've got to give it massive eyes so it's cute and draws you in, kind of like when you're looking at Wall-E or Cozmo, if you guys are familiar with those products, or you've got personality, or behaviour, movement, or motion. Everyone can see the screen on my phone? Cool. The robot; we're building this end-to-end system, starting with the robot through to a game that we built on the phone, so what you have is a platform that combines robotics and gaming with augmented reality, and we are the first ones to build that. The platform you can't really see. We created this robot, MekaMon, on top of it; it's the first product. But to give you an idea of how this can expand in the future, working with other companies: you have properties, some of which you've seen in movies and videos, who also want to bring their characters to life, and our technology would be enabling that. Because no one has done this before, we've got to create the robot to actually show that. Talking about the robot by itself, one of the things that we're working on at the moment is being able to interact with the robot without the app. At the moment the firmware isn't there, but it's got the capability to do that because it has an IMU, so it's aware of its own sense of self and balance, so in the future if I tapped it, it would know, or roll over, or react. They're going to have behaviours and learn and react, but at the moment it's mostly dumb. Sorry, yeah. This is the app that we built to control it. Like with every video game app, you download it, you can have an account and sign in. I'm just going to log in as a guest, and that's syncing with our database and back-end.
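As a rough illustration of the IMU-driven "self sense" idea (which, per the talk, isn't in the shipping firmware yet), a reaction classifier could look something like this. The thresholds, the sample format and the reaction names are assumptions for the sketch:

```typescript
// Illustrative only: how an IMU-driven "self sense" reaction might be
// classified. Thresholds, the sample format and the reaction names are
// assumptions; the shipping firmware doesn't expose this yet.

interface ImuSample {
  accel: { x: number; y: number; z: number }; // m/s^2
  gyro: { x: number; y: number; z: number };  // rad/s
}

type Reaction = "flinch" | "self-right" | "idle";

const GRAVITY = 9.81;
const TAP_THRESHOLD = 4;       // sudden jolt above gravity, assumed
const ROLLOVER_THRESHOLD = -5; // z acceleration flips sign when upside down

function classify(sample: ImuSample): Reaction {
  const magnitude = Math.hypot(sample.accel.x, sample.accel.y, sample.accel.z);
  const jolt = Math.abs(magnitude - GRAVITY);
  if (sample.accel.z < ROLLOVER_THRESHOLD) return "self-right"; // lying on its back
  if (jolt > TAP_THRESHOLD) return "flinch";                    // someone tapped it
  return "idle";
}

console.log(classify({ accel: { x: 0, y: 0, z: 9.8 }, gyro: { x: 0, y: 0, z: 0 } }));
```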
At the moment we're going through beta testing, so we did a small launch in November 2016 where we manufactured 500 units, and that's the first time people were able to- it was really amazing to actually have created something over the last three or four years and for people to then-- We're offline, yeah. And for people to actually then put money down for the product is an amazing feeling. And then I remember just a few months ago when we actually shipped, after a tonne of delays, from manufacturing delays in China to logistics issues, and the first guy was doing this unboxing video, and the most common feeling across everyone in the office was, "Holy shit, that's so nerve-wracking", watching someone go through the product, and maybe they're going to absolutely hate it and go, "What's this shit?", and chuck it out the window. But it wasn't like that. People were really looking forward to what we've been creating, and so far- - It was really stressful. - It was really stressful. I think it was a really long video, about 20 minutes, of this guy just going through it, and he loved it, and it felt really good for the first time. I think the next thing we're looking forward to is at the end of this year, when we're going to have the product in a store for the first time for the retail launch. I'll probably just be creeping in a corner, waiting for some kid to come up, pick one up and say, "Mom or Dad, I want this", and that's going to be an amazing feeling. Okay, we're not connected online; will AR work or-? Okay. We'll see. If it doesn't, I'll take a little break and John will talk about the challenges while I work in the back. It's super interactive because I won't be here for the whole session, so if someone has questions in between, just blurt them out at us. You've got a few sections of the app. The first thing I will do is just bind to the MekaMon, and now I'm connected to that one. And the first thing that people can do is go into Dropzone, and what we've done here is give you the ability to control every aspect of your MekaMon. I can make it do pretty much whatever I want. I can change the body height, because it's pretty low at the moment. I can change the gait. There isn't a lot of room to work here, but it seems to be the best place for everyone to see it. So if you're looking, it's hard to look at both places, but you can see from your trackpad, that's forward. That's back. That's left. You can strafe as well. You can rotate on the spot. And, you know, big deal, the robot moves forward, back, left and right, but because we've got the complex twelve degrees of freedom in there and this controller, it means we can do some really complex animations and really cool stuff. This is what a dying robot looks like, for example. Down to that little shudder and everything. Everyone gets a laugh from that. You've got a robot that can really express itself. We've got a load in there, so I can preload lots of animations, and we're not even scratching the surface of what this guy can do in theory. It's just, when you've created something that's got not infinite but a huge range of capabilities, how do you then even start to make that work? For example, to be able to create these animations, what we've had to do was create a model in Maya and then create a tool to take into account some of the friction and joint limits, and then turn that into basically joint coordinates. That's when you're doing the animations.
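The animation pipeline described above (Maya keyframes converted, via a tool that respects joint limits, into joint coordinates) could be sketched roughly like this. The limits, frame format and joint ordering are invented for illustration:

```typescript
// A rough sketch of the animation pipeline described above: keyframes
// exported from a Maya model are clamped to physical joint limits before
// becoming joint coordinates for the robot. Limits, frame format and joint
// ordering are invented for illustration.

interface Keyframe {
  timeMs: number;
  angles: number[]; // one angle per joint, degrees
}

// Assumed per-joint limits, cycling hip/thigh/knee for each of the four legs.
const JOINT_LIMITS: Array<[number, number]> = [
  [-45, 45],  // hip
  [-60, 90],  // thigh
  [-120, 0],  // knee
];

function clampToLimits(frame: Keyframe): Keyframe {
  return {
    timeMs: frame.timeMs,
    angles: frame.angles.map((angle, i) => {
      const [min, max] = JOINT_LIMITS[i % JOINT_LIMITS.length];
      return Math.min(max, Math.max(min, angle));
    }),
  };
}

// A "dying robot" animation becomes a list of safe joint targets over time.
function bakeAnimation(frames: Keyframe[]): Keyframe[] {
  return frames.map(clampToLimits);
}

console.log(bakeAnimation([{ timeMs: 0, angles: [0, 120, -130, 10, 20, -30] }]));
```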
When you're doing the mathematics, you're taking into account your trigonometry and things like that. You've got two modes of controlling it there, and when you start putting the behaviour in, you start to have models and filters in place, so now a sad robot should move slower, for example; that's the next part we are working on and a behaviour that we want to put in. So I can go from the robot moving relatively fast like that, to saying I'm going to slow down the speed and it's going to go into creepy stalker mode, for example. I can slow it down a lot more than that. You have a robot that can express its personality in lots of ways. You've got your video game character, so to speak, and there's a lot more you can do. All of these controls, you can change all of them. On the left and the right-hand side are different floor types. You've got carpet and tile, for example, so you're able to change it and have it adapt to different environments. We don't have it completely autonomous, so you can't just leave it to run off by itself. Two reasons: one is because it would increase the cost and increase the complexity of the product at this stage. And I think when you start to make the product smarter, it's not going to be smart enough to the point where it satisfies people's expectations of what a robot should be, so once you enter that zone, you're always going to end up with a dumb robot and it's always going to make mistakes. Instead you create a kind of horse-and-rider situation, where there's a dependence on the user. We've got four transceivers so it can see other robots when you're battling and interacting with other robots, but it can't interact with the world. So it's your responsibility to take care of it. If it falls off the table, it's kind of your fault, rather than it being something that was supposed to be smart. Those are the reasons and considerations we have had in there. We've got a robot, we've got a personality, we've got a character; now what else do you then do with it? That then becomes the game. You've got your robot, but how do you turn it into a gaming robot? When I set out on all of this, the idea was just to have a battle robot, having robots that could fight each other. The price point was $99. I learned a lot over three years. It's not possible to get the complex motion that we are trying to do at that price point, but we set the bar, which is our floor: the robot has to be able to do that, and then whatever the price is, we'll work on it, but at least we're able to show the world what it's capable of. We are going to release custom-engineered versions later and stuff like that, and people can always upgrade, so at the moment this is $279, and that's roughly the price point that we're going to keep it at. It's not super cheap. It's a premium product, more or less, but it shows the world the potential for products like this, and I think if we show the world all the functionality and products we want to get in there-- then it will be worth the price, basically. You've gone into Dropzone and you're seeing the robot. The first thing you can do on the left here is go into MekaMon Universe- Oh yeah, I'm not connected to- So in there you've got Battle, which is where you've got two robots that can compete against each other; that's primarily what these robots are designed to do, the whole idea behind them, so we've got ports on the back, they've got shields like that. But as a startup we are also open to: what if that's not what most people want to do with it?
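The "behaviour modifies motion" idea, a sad robot walking slower, a different gait on carpet versus tile, can be sketched as simple scaling of a base gait. All the names and numbers below are assumptions, not the real control parameters:

```typescript
// A toy version of "behaviour modifies motion": mood and floor surface scale
// a base gait. All names and numbers are assumptions, not the real controls.

type Mood = "happy" | "sad" | "stalker";
type Floor = "carpet" | "tile" | "wood";

interface GaitParams {
  stepHeight: number;   // metres
  strideLength: number; // metres
  speed: number;        // metres per second
}

const BASE_GAIT: GaitParams = { stepHeight: 0.02, strideLength: 0.05, speed: 0.3 };

const MOOD_SPEED: Record<Mood, number> = { happy: 1.0, sad: 0.5, stalker: 0.2 };
const FLOOR_STEP: Record<Floor, number> = { carpet: 1.5, tile: 1.0, wood: 1.1 };

function gaitFor(mood: Mood, floor: Floor): GaitParams {
  return {
    stepHeight: BASE_GAIT.stepHeight * FLOOR_STEP[floor], // lift feet higher on carpet
    strideLength: BASE_GAIT.strideLength,
    speed: BASE_GAIT.speed * MOOD_SPEED[mood],            // a sad robot moves slower
  };
}

console.log(gaitFor("sad", "carpet"));
```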
We had a lot of people requesting, well, I just want to play with the robot. I'm sure a lot of people just want to chase their cats with it, you know, messing about with your cat with it. We started looking at it: this doesn't have to be just that. We have created something so complex, and games, for example, are moving more and more towards bridging the gap between the virtual and the real, with AR and VR and things like that. There are going to be opportunities to create technology that bridges augmented reality with robotics and gaming. So instead we looked at how we get this to be a platform: not just robots that battle each other, but augmented reality as well. Augmented reality gives you an opportunity to have a compelling single-player experience. These robots don't move super fast. They don't jump and they aren't super cheap. If you have one by yourself, you can't do that much with it once you've played with the robot, because we don't really have all the behaviour capability in there yet. You're going to need augmented reality to have that personal experience, which John will mention in a bit, but before I do that, I just want to show you what we have in the Archives. What we have also done is start to create content as well that links deep into the MekaMon world. If you're creating something like this-- how many people are familiar with Skylanders by Activision? In case you've bought lots of presents for cousins or kids or something like that: it was a land grab, basically. They made a crap tonne of money creating this thing that bridged the physical and the virtual world. But they had to create their own property, because there just wasn't something suitable, with so many characters, that they could use for this kind of thing that we are building. When creating something like this that the world has never seen before, it seems like a mistake not to at least have a shot at creating our own world, creating our own characters, so we've started to invest in that. We've actually seen strong interest from Japan. I can't mention much more than that, but potentially that could be quite big for us. We started looking at: where did the MekaMon come from? In here the user will be able to go into the world, and there's a classic story: aliens came over, took over the world, stuff like that, and then the MekaMon are able to compete and fight back. As part of that we use augmented reality as the storyteller mode, so if I put the mat down, for example, you can go into that story and actually learn a little bit more. We've got chapters, and if I put the mat down, what you can then start to see is, this is your classic put-a-static-object-down-and-track-it setup. This is not super crazy on the AR side of things, but when John starts to show you the game inside, that's where I think the most impressive use of AR is. For example, if I do that, what this means is that in the future, when real-world events are happening based on the story, we can show people. You can get closer and see the MekaMon world, where the MekaMon are supposed to come from. You can see the Meka-Academy, which has a significance in our story world. You can explore the world, which is one of the benefits of AR, just to make that story a bit more, a bit more real. If something is happening in the world and there's a weapon or an event, I want to get a top-down view and see it, then have my robot interact with the story as well.
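The storyteller mode boils down to tracking a static mat and placing virtual content relative to its pose, so you can walk around it and get that top-down view. A toy version of the placement step, with an assumed pose format and invented scene objects, might look like this:

```typescript
// A toy version of the marker-anchored story scene: once the mat is tracked,
// virtual objects are placed relative to the mat's pose so you can walk
// around them. The pose format and scene contents are assumed.

interface Pose {
  position: [number, number, number]; // metres, world space
  rotationY: number;                  // yaw only for simplicity, radians
}

interface SceneObject {
  name: string;
  offset: [number, number, number]; // metres, relative to the mat
}

const STORY_SCENE: SceneObject[] = [
  { name: "Meka-Academy", offset: [0.1, 0, 0.2] },
  { name: "Crashed ship", offset: [-0.15, 0, 0.05] },
];

// Convert an offset expressed in mat space into world space.
function placeOnMat(mat: Pose, obj: SceneObject): Pose {
  const [ox, oy, oz] = obj.offset;
  const cos = Math.cos(mat.rotationY);
  const sin = Math.sin(mat.rotationY);
  return {
    position: [
      mat.position[0] + ox * cos - oz * sin,
      mat.position[1] + oy,
      mat.position[2] + ox * sin + oz * cos,
    ],
    rotationY: mat.rotationY,
  };
}

const matPose: Pose = { position: [0, 0, 1], rotationY: Math.PI / 4 };
console.log(STORY_SCENE.map((obj) => placeOnMat(matPose, obj)));
```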
Someone is attacking the base; let me take my robot that's physically here and take it into the virtual world. When I'm done playing, the robot is still in my hand. I can still touch it, repair it, and do stuff like that. Next is then- let me reconnect to it- Battle, and I'll let John explain what we have done with battling. - Yeah, so a few things that I want to jump in on and mention: they are fairly fast. I measured them running at about 0.67 metres per second, so that isn't too bad, when you said they can't jump and go quite slow. We have a video of it jumping. - I don't want to oversell it too much. - These games, the games that we are shipping at the moment, are this Battle mode that we are talking about and this battle simulator, and really they are prototypes to show what we can do with the technology. One is to show everything we can do without augmented reality, and the other is to show an example, given the tech, of the games you can build with augmented reality, and to sort of simplify that to some extent. They use the same connection across both games. The battle itself, to give you an overview: essentially, we didn't want just two robots walking toward each other and someone arbitrarily firing something or smashing a key as hard as they could, where the person who mashes the fastest would win the game. - We actually tested it, so we got two robots: if we were going to play with two robots, what would you do? They don't have any sort of strategy. The attempt is to run them into each other. We're talking two years ago; we took two robots that are way bigger than this and just rammed them. There are different gamers. In arcade mode we'll have stuff like that, which we will mention later, but this is more: you need strategy, and you want this to be a longer-lasting, deeper gaming experience. - Promoting manoeuvrability and moving around each other is a big goal for us, so we built almost like a collectable card game, a CCG, built around the game. Here's a preview with lots of filler art that shows you the stats of the individual cards that you can take into your game; they have over 100 parameters or something like that behind them, so it's quite complicated. But yes, you take this deck of cards into the game with you, they're shuffled and dealt at random, and you play a battle against your opponent. The battle is in real time as you draw the cards. It allows us to put deeper mechanics into the game, so you can have one card in your deck that stuns the opponent for maybe 20 seconds or something like that. Rather than just having attacks, you've got these constant random elements and the deck construction. A lot of what we did was guesswork, based on not knowing how fast or manoeuvrable the final robot would be, but also trying to promote manoeuvrability and promote movement around the scene: around people's bags they put on the floor, around tables and stuff like that. That's pretty much the design concept of that. All of our games are built to be networked from the ground up, with LAN or matchmaking servers, so everything, even the simulators, has a sort of network backbone to it, which is always a good thing when you're building games. I don't know how many people here have worked on games, but changing to a multiplayer setup later down the line is quite complicated. I'll do a desk demo of the simulation- - Yeah, cool. - Yeah. We have a calibration process that runs, at the moment, every time you turn the robot on.
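A minimal sketch of the card-battle structure John describes, a shuffled deck dealt in real time, with cards carrying many parameters (including effects like a stun), could look like this. The field names, values and effects are illustrative assumptions, not the shipping game's data model:

```typescript
// A minimal sketch of the card-battle structure: a shuffled deck is dealt in
// real time, and each card carries tunable parameters, including effects like
// a stun. Field names, values and the parameter count are assumptions.

interface BattleCard {
  name: string;
  damage: number;
  energyCost: number;
  stunDurationMs?: number; // e.g. a card that stuns the opponent for ~20s
  // ...the real cards reportedly have on the order of 100 such parameters
}

interface RobotState {
  health: number;
  stunnedUntil: number; // timestamp in ms
}

// Fisher-Yates shuffle so the draw order is random each battle.
function shuffle<T>(deck: T[]): T[] {
  const copy = [...deck];
  for (let i = copy.length - 1; i > 0; i--) {
    const j = Math.floor(Math.random() * (i + 1));
    [copy[i], copy[j]] = [copy[j], copy[i]];
  }
  return copy;
}

function playCard(card: BattleCard, target: RobotState, now: number): RobotState {
  return {
    health: target.health - card.damage,
    stunnedUntil: card.stunDurationMs ? now + card.stunDurationMs : target.stunnedUntil,
  };
}

const deck = shuffle<BattleCard>([
  { name: "Plasma burst", damage: 12, energyCost: 3 },
  { name: "EMP stun", damage: 0, energyCost: 5, stunDurationMs: 20_000 },
]);
console.log(playCard(deck[0], { health: 100, stunnedUntil: 0 }, Date.now()));
```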
It will eventually- I actually did it completely wrong, and I wrote this as well, so that's pretty stupid. The idea of this is to calibrate the IMUs. The last stage was using the accelerometer, and that's getting the compass and gyroscope calibrated. The idea with this concept is that you- oh, by the way, there is also a bug. We released this build of the simulation, of this game, on a Friday, so there is a bug on Apple devices that can hard crash. - Yeah. - So if that happens, I warned you. We first of all track the ground plane, and at the moment we use an off-the-shelf system to do that. It tracks this marker on the floor and natural features in the scene. These features here are sort of okay, actually; they might be useful. At the moment we are looking, hopefully within the next couple of months, to move toward ARKit and ARCore for our ground plane detection. ARCore, if anyone missed that yesterday, is the Android SDK for doing markerless AR tracking. You can walk around and track stuff, and it spots planes and points for us. This tech is based on what was available a little while back, so for now this is sort of what we are shipping with. Then you calibrate the tracking for the robot. You can see the UI comes alive and there's a virtual enemy over there, so I am going to shoot and get into it. Can I just use the sound on there? - Turn it on. - Is it going through the- - Oh yeah, the sound is trying to go through the Apple TV, and I'm not sure we have... - So this framerate is not super; you can see what's going on-- - Yeah. You can sort of see I am being shot there by a virtual enemy. You can see-- - Turn. - You can sort of see when it shoots me; my robot reacts in the direction I was shot from. - Okay, there you go. Yeah, that's the latency. - So it's probably not as easy to see on the screen as it is on the phone, but you can get the concept of this game. For us, one of the biggest challenges with this is the usability of the system: really building the way the tracking works, the way the user interface works, and the sort of user experience that happens. We're doing a lot of testing at the moment to make sure-- You can also see the building actually blocks movement, so you can see me moving towards it there, and also the robot has a virtual collision volume around it, so you can't walk through buildings and enemies can't walk through it; it's got a solid presence. And sorry, our network connection is a bit screwed, but- - That's alright. - Yeah, that's the sort of brief overview of the AR. We wanted to build the robot to- if you guys know about computer vision, there's also an area of research called machine vision, and the idea of that is you're doing things with automation, like tracking grains of rice on a conveyor belt and things like that, but the idea of machine vision is that your hardware is built to make the computer vision as easy as possible. That's kind of what we tried to do. We built the robot in the best way we could so that we can track it. There are a thousand different issues that we ran into designing the marker that we ship with: the head shape, the colour, the material used. The colour's important because when you have a lot of ambient light reflecting, even from a diffuse surface, the glow of the LEDs underneath doesn't really contribute much, say if you're out in the sun. You have to take all of these situations into account to be as robust as possible.
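The virtual collision volume John mentions, where the robot can't be driven through virtual buildings, comes down to a simple top-down intersection test before a move command is forwarded. Here is a sketch under assumed shapes and sizes:

```typescript
// A sketch of the virtual collision volume idea: before a move command is
// forwarded to the robot, check its proposed top-down footprint against the
// virtual buildings. Shapes and sizes are assumed for illustration.

interface Circle { x: number; z: number; radius: number }                 // robot footprint, top-down
interface Box { minX: number; maxX: number; minZ: number; maxZ: number }  // building

function circleIntersectsBox(robot: Circle, building: Box): boolean {
  // Closest point on the box to the circle centre, then compare distances.
  const nearestX = Math.max(building.minX, Math.min(robot.x, building.maxX));
  const nearestZ = Math.max(building.minZ, Math.min(robot.z, building.maxZ));
  const dx = robot.x - nearestX;
  const dz = robot.z - nearestZ;
  return dx * dx + dz * dz < robot.radius * robot.radius;
}

function canMoveTo(next: Circle, buildings: Box[]): boolean {
  return !buildings.some((b) => circleIntersectsBox(next, b));
}

const buildings: Box[] = [{ minX: 0.3, maxX: 0.6, minZ: 0.3, maxZ: 0.6 }];
console.log(canMoveTo({ x: 0.5, z: 0.5, radius: 0.1 }, buildings)); // false: blocked
```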
This is one of the 3D-printed rigs we used for head testing, if you want to pass some out. - If you guys want to, woah. - Yeah, literally we have- I don't know if I can tip them onto a chair or something, so- - This table? - We did so many tests to try to find the right sort of material and lighting for the head, it's ridiculous. I only have probably 10 percent of- - The prototypes, yeah. - Different things that we tried here. Some of these are things that were manufactured in the factory, like this really weird white shiny one, which is obviously super useful. This was a mould that Dan made- - Yeah, it was a silicone mould. This is the ideal- - I used to go shopping in kids' stores to find bits of plastic that might work, that I could scuff or whatever to test the plastics out. It's a real problem because we want to make the tracking robust, and the ground plane tracking is super expensive. ARKit is iPhone 6S onward or something like that, so it's a really expensive thing, and this tracking is not even doing as much as ARKit. We wanted to make that part as cheap as possible, and we had to work with the hardware design team to make that possible. - And that's one part of the prototyping process. The head, we had to go through all sorts of different ways of making the robot- - Change to the second slide. - Yeah, this is the final slide, guys. There were lots of things we could go through, down to fundraising with shitty prototypes and things like that, but I thought it was more interesting to show-- - Yeah, yeah, yeah. - He's talking about the prototypes-- - Oh yeah, right. Down from here is a cable-and-gear-driven one; this was a finger trap, so there was no way you could ship that. Chris still has a blood spot on his finger where he actually got his finger caught in there. This photo was taken two years ago, and the number of prototypes we have gone through since then is insane, but as you guys are all aware, the one thing that everyone understands is that if you are going to make something good, you have to make it, break it, make it, break it, and that's the process we've gone through as well. A laser-narrow focus on what we're trying to achieve, which is to make something really cool. Going through user testing was like a watershed moment: we sent out an invite to get some feedback, and it was great to see people were having fun and enjoying it. That's why we are doing this. Hopefully when this gets to market, people keep loving it and we can implement all the plans that we have in mind. I'll cut it there, I think. That gives an overview of what we have done so far. - Can I say something? Two things, two things. This sounds really emotional and like a lot of back-slapping, but two of the hardest things I think we did were, first, having this ridiculous idea that you wanted to create a full-degree-of-freedom robot the whole time, with all of us the whole time saying this can't be done at the price point and we can't really make the software for it, and him insisting, and now we have a cool product that makes it a lot easier to sell. And second, our guy that handles all the logistics and manufacturing, which is just the most insane thing to go through- - Yeah. - It's been such a nightmare road: certifying everything for all the countries we are shipping to, creating all the processes for manufacturing and testing and all the hardware, actually doing the shipping and the logistics and all that sort of stuff. - It's insane. - For a software person, it boggles my mind how complex it is; it's crazy.
- I'm sure when the other hardware-related companies come up and talk, they will tell you about all the additional challenges as well. But that's a little about us, Reach Robotics, and MekaMon. Thanks for listening. Thank you.