Jarek Wilkiewicz: Hello, everyone. I'm going to go ahead and get started. Welcome to our session, and thank you very much for coming. I realize there are free drinks after this, so we're going to try to keep you entertained until then. We'll talk about supercharging your mobile game with YouTube, and I would like to introduce my co-presenter, Corey, from Unity. He is working on Unity implementations with a number of partners.

I wanted to start with this photo that I found on the internet. As you can tell, it was taken a while ago. There are a few really excited mobile gamers; this photo was taken at a gaming convention in Vancouver, Canada. The question that popped into my mind when I saw this is,
what would it take to excite and energize your gamers, your players, your customers equally today?

Well, if you look at some stats, there was a study done two years ago, and 95% of gamers who spend significant time playing games watch user-generated content on YouTube; these are gaming clips. The number for trailers is actually 94%, so more people watch user-generated content than the highly produced gaming trailers.

In fact, several companies are already taking advantage of this trend, and some of the titles I'll show you will probably be familiar to you. Trial Xtreme 3 and FIFA 13 have video upload capability, so you can share your gameplay directly from the game. And then there's the granddaddy of them all, Talking Tom. This is an application that has really taken great advantage of video sharing: in that app, you can create a video, so it's both a nice virtual-pet type of app and a self-expression platform.

So this is what we'll talk about today. We will show you how you can take a sample Unity game that we built for this purpose and integrate the YouTube API in order to share the gameplay with the world. YouTube has one billion users, so there are a billion potential customers out there. We'll show you both the upload and the playback capability, and how they can be integrated into a Unity game. Really, the only thing you have to do as a game developer is focus on building a great game. That's it.

First, I wanted to show you a little demo of the game that we have built for this purpose.
So here's my game. The objective is to shoot these gas cans as fast as possible. I'm pretty bad at it, as you can tell, but I'll try to do my best. One more. Oh. I got 12 points. All right, time is up.

So for the next step: I just had this wonderful score of 12 points, and I would like to watch myself play this game, so here's a replay. Pretty amazing. I rock.

What I do next is, well, I definitely want to share this gameplay on YouTube. So I'm going to hit the YouTube button and let the processing take place. What this means for you is, imagine you were playing an awesome game, some sort of warfare simulation, with your best friend, and you just blew up his or her Abrams tank. You can share that achievement on YouTube and claim the bragging rights for the rest of the week.

Now, I would like to introduce Corey, who will take you through the process of creating games with Unity, and then he will discuss some of the plugin opportunities
that exist within the Unity framework. After that, I'll take you through the actual integration between the YouTube APIs and Unity. So over to you, Corey.

Corey Johnson: Thank you. Hello, everyone. My name's Corey Johnson. I'm a field engineer at Unity Technologies. This is Unity circa last fall. There I am, just in case you were doubting me.

I'm going to take you through what Unity is and what we're about, and then give you an overview of our editor, just so that we have some context when Jarek talks about his plugin later. I apologize if you already know Unity; we're going to stay high level, but I want to go really, really fast. And if you don't know Unity, I apologize, because we're going to go really, really fast. But my illustrious colleagues and I are in the sandbox on the third floor in the Android section, so please feel free to follow up.

So what is Unity? Well, we're a cross-platform engine. We come with an editor, which is a tool to create 2D and 3D content, and we believe we have the best tool in the industry. We believe we give you a rapid learning curve and rapid iteration times. We have a mantra of "build once, deploy anywhere," meaning we're, again, multi-platform. Our mission statement is to democratize game development: we want anybody out there who wants to build a game, whether an artist or an engineer, to be able to build one. That attitude has led to our tooling, and it has also led to an enormous community, which I'm going to talk about in a second.

When I say "everywhere," let me quickly define that. We're currently on 13 different platforms, with more on the way. We even have Union, which helps get your content onto platforms you may not even know you want to be on yet.
As you can see, we're fairly prolific, and if you want to be there, we're probably there for you.

A little bit about our community: 1.8 million people use the product, 400,000 monthly actives, 5 million hours of creation every month. Enormous, from the hobbyist all the way up to triple-A studios. One of the things we did to democratize this is build the Asset Store, a marketplace for our users to share content, whether it's tools they built in our customizable editor that we just didn't get to, or awesome artwork that people need. Me, I'm an engineer; I can't do art, but I can go in and buy awesome environments, or a knight to run around my game.

One of the features we use is called native plugins. We're going to talk a lot about them today, so I just want to give you an overview of what they are. Unity uses Mono for scripting, which means you can run your game scripts across platforms. What plugins let you do is call native code, on whatever platform you're on, from your game scripts. For this talk, we're going to focus only on Android, and I will point out this is a Pro feature.

There are two ways to do plugins: the first is native, and the second is Java.
They're actually interchangeable. I'm going to talk about native first, and then a little bit about Java.

Here is an example of some C code for a minimal plugin; all it does is return a value. All I need to do is take this C file, build it into a .so file using the Android NDK, and place it in my project's Plugins folder, which I'm going to show you in a minute. To call that code, which can obviously be doing far more complex things, all I need to do from my game code is use interop services, annotate my function definition, and then call my code. That simple.

For Java plugins, it's a little bit different. We use the JNI to interact with the Java class that you build. You can build a .jar file using Eclipse and the Android Development Tools; just build a .jar that contains all your classes, make sure you check the "Is Library" button, and place it inside your Plugins folder. Because you have to do some work with the JNI to discover your methods and then invoke them, we provide some wrapper classes, AndroidJNI and AndroidJNIHelper. On top of that, we build another layer of helpers, AndroidJavaObject and AndroidJavaClass, which not only automate the whole process but also cache those lookups, so subsequent calls are a lot faster.

Instead of showing you Java code for the plugins, I'm just going to show you an example here, and then Jarek's example is going to show a lot of Java code. On the slide, the two lines that are not commented out are what you need to make a call to get this hash code string. All the commented-out lines are what you would do if you had to do that natively, so it gives you a sense of how much work we save you and how much prettier your code can look.
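To make the .jar workflow concrete, here is a minimal sketch of the kind of class you might ship in a plugin library. The class and method names are invented for illustration; from a Unity script, something like `new AndroidJavaClass("ScorePlugin").CallStatic<string>("describeScore", 12)` would invoke it.

```java
// A minimal example of a class you might ship in a plugin .jar.
// Unity's AndroidJavaClass can call the static method by name.
public class ScorePlugin {
    public static String describeScore(int points) {
        // Arbitrary illustrative logic: tag high scores as share-worthy.
        return points >= 10 ? "score:" + points + " (share-worthy)"
                            : "score:" + points;
    }
}
```

Because AndroidJavaClass caches the method lookup, repeated calls like this avoid paying the JNI discovery cost each time.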
That's about it for plugins; what you do with them is up to you. Here are some pro tips.

When you're dealing with Android, you obviously have the whole native platform at your fingertips. You can do things like add activities, or use features that we don't normally request permissions for. In order to do that, you need those entries in your Android manifest. Usually we generate one for you, but you can take that, modify it, add whatever you need, and place it in your Plugins folder, and we will automatically use that one instead. That way, you don't have to rebuild it every time.

One more thing, whether it's for yourself or for future users, customers, or teammates: you saw there was some work in annotating your function definitions and making the JNI calls. It's really easy to drop in another code file that does all that work for you and exposes only the raw APIs you want users of the plugin to actually see.

Also, we're dealing with managed code in your game scripts. When you go down to the native level, we have to marshal any data you want to deal with back and forth. Be aware that there is some penalty cost to this, so you may have to weigh where you want to do the work.

We're talking a lot about video today, for obvious reasons, so I just want to point out some function calls that we're going to use later, what they do, and explain a little bit.
The first is OnRenderImage. This is a callback to you, so you can react whenever all the rendering is finished for your render target. It means that everything is done, and normally what we would do here is Michael Bay up the screen, adding bloom effects and lens flares. To get a replay, all we need to do is copy the frame.

Similarly, OnAudioFilterRead gives you a chance, every time we read a piece of audio data that we're going to send into the audio processor, to customize it. You can squelch it, make yourself sound like Darth Vader, whatever. Again, in this case, all we're doing is using that opportunity to copy the data so we can encode it later.

When you're dealing with a plugin that touches your frame timing, you want to sync up when your plugin grabs the frame versus when Unity is rendering, because you don't want to grab frame data in the middle of a write to the frame buffer. WaitForEndOfFrame is an object you can yield to in your coroutines, and it makes sure that when your code resumes, you're at the end of the current frame. That keeps everything in sync.

I'm going to switch over to the editor now. Awesome. So this is the Unity editor. This is where you're going to build your game. The first thing I'm going to talk about is our Project window, right here in the middle. The Project window is literally a mirror of what's actually on disk.
It's all your models, meshes, textures, sound files, et cetera. You can literally go look at the file system, and it will look exactly the same. You'll see here I have this Plugins directory. Plugins is one of our reserved words: we look for that folder and treat the things in there as plugins, so we know how to build them into your final game. Then you just say, "I want an Android build." As you can see here, we have a custom Android manifest, and then a bunch of libs that Jarek's going to talk about later. Not everything in there will end up in the final build of your game.

On the left here, we have the Hierarchy window, which you can see to the right of my screen. This represents everything in our drawing screen, like my wall here. I can move it up and down and break his game, and do all sorts of fun stuff. This is where you build the layout of your game.

We are a component-based engine, which means that everything in our game engine is an object with components that add behavior. Over on the right, we have the Inspector window, which shows all the components on an object. You can see that I currently have the wall selected: it has a mesh, a mesh renderer, some animation, and a box collider. All these do is tell the engine different things, and they have different widgets where we can customize how the engine reacts to these objects.

If I select our main camera, we'll see a bunch of scripts attached to it. To get custom functionality into your game, you create scripts. Our scripts, again, are Mono, so they can be C#, JavaScript, or Boo. We ship with MonoDevelop, so you have everything you need to develop a game already installed.

Inside here, it looks like I picked the audio one. That's fine. Inside we have OnAudioFilterRead, and you can see all we're doing is checking that recording is active and then passing the data to a convert-and-write function. Similarly, for video we have OnRenderImage, where all we do is a little bit of logging and make a copy of our frame buffer.

I mentioned earlier that we have rapid iteration times. At any time, you can hit the Play button at the top of the screen, and we let you play your game and test it out. So I can test the force of the tennis balls we're shooting, et cetera. So that's a little context, so that when Jarek's in there later showing screenshots, you kind of know what you're
looking at. I'm going to pass it back to him now.

Jarek Wilkiewicz: OK, thank you, Corey. So Corey took you through the process of creating a game with Unity, and if you're a Unity developer, really the only thing that might be new for you is the way you integrate plugin capabilities. If you haven't tried Unity yet, I encourage you to try it out. It's a lot of fun.

What I'll cover next is how we can integrate video upload and video playback capability into the game, to really take advantage of the opportunity we highlighted earlier. First, let me switch back to my demo. The gameplay I was showing you earlier, when I scored an amazing number of points, is now ready to share. I think it's really awesome. I'm just going to hit Upload, and at this point, we're actually uploading the video to YouTube. I'm also generating a notification to show the status of the upload.

While that takes place, let me walk you through what actually happens. We are using WebM as the video container, with the VP8 codec and the Vorbis codec for video and audio respectively. Tomorrow morning, there's a talk about VP8, so if you would like to learn more, I highly encourage you to check it out. We're also launching VP9, and there's another talk about that next generation of the technology.
But let's take a step-by-step look at what is involved here. We started with this awesome Unity game, and what we actually need to get out of the game engine is the audio and the video frames. That's what is illustrated on this diagram: we get the video frames and pass them to the VP8 encoder, we get the audio frames and pass them through the Vorbis encoder, and then we use the WebM container to create a video file for us.

Once that file is created, we use the YouTube Data API, in this case version 3, to upload the resulting video to YouTube. The YouTube Data API is RESTful and fairly straightforward to use. For Android, we use the Java client libraries, so you don't actually have to write any HTTP or JSON parsing code.

In our example, we built a Unity plugin to take care of the video encoding and the video upload from Unity. A couple of hints from developing the plugin. Corey mentioned that the Android manifest is something Unity generates by default, since it actually creates the APK. However, if you're building any additional activities in your code, or need additional permissions, you need to merge those into the manifest created by Unity. The way to do that is: the first time you build the game, the temp staging area will contain an Android manifest; you edit that, copy it over into the Plugins directory, and you're all set. Another thing to pay attention to is that plugins are actually required to be built as Java Android libraries, so you need to be careful about resource merging.
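As an illustration, the kind of additions you would merge into the Unity-generated manifest might look like this. The service name and the exact permissions here are invented for illustration; your plugin's actual needs will differ.

```xml
<!-- Illustrative additions merged into the Unity-generated AndroidManifest.xml -->
<manifest xmlns:android="http://schemas.android.com/apk/res/android"
          package="com.example.game">
    <!-- Permissions an upload path plausibly needs -->
    <uses-permission android:name="android.permission.INTERNET" />
    <uses-permission android:name="android.permission.GET_ACCOUNTS" />
    <application>
        <!-- A background service for the upload, so the player isn't blocked -->
        <service android:name="com.example.game.VideoUploadService" />
    </application>
</manifest>
```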
Really, the only thing that is somewhat inconvenient is that if you're used to ID-based resource lookup directly in your code, inside the plugin, while you're running under Unity, you should use the programmatic way of looking up resources instead. Other than that, it's just regular Android development.

Now, Corey mentioned some of the methods involved in video and audio capture, so I'm just going to quickly go through what we did for this demo. We attach two scripts to the camera: one script responsible for video capture, the other responsible for audio capture.

Let's start with the audio. We are capturing the audio at a 24 kHz sample rate, so this is the raw PCM data that you're going to be receiving in your application, and you can configure that using AudioSettings.outputSampleRate.
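One detail the talk glosses over: Unity delivers those samples as 32-bit floats in the range [-1, 1]. If your encoder wrapper expects 16-bit PCM, as many do, the conversion would look roughly like this. This is a sketch; the class name is made up, and whether you need this step depends on what your Vorbis wrapper accepts.

```java
public class PcmConvert {
    // Convert Unity's float audio samples (range [-1, 1]) to 16-bit PCM,
    // little-endian, the format many encoder wrappers expect.
    public static byte[] floatsToPcm16(float[] samples) {
        byte[] out = new byte[samples.length * 2];
        for (int i = 0; i < samples.length; i++) {
            // Clamp to [-1, 1], then scale to the signed 16-bit range.
            float s = Math.max(-1f, Math.min(1f, samples[i]));
            short v = (short) Math.round(s * 32767f);
            out[2 * i]     = (byte) (v & 0xff);        // low byte first
            out[2 * i + 1] = (byte) ((v >> 8) & 0xff); // then high byte
        }
        return out;
    }
}
```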
For video, I'm just using 10 frames per second right now. Unity has a nifty feature: you can set the capture frame rate, which is different from the actual target frame rate at which your game runs, precisely for this use case. So if you want to record a video of the game, you can set the capture frame rate to whatever makes sense, and you operate within a number of constraints that we'll touch on a little later.

For audio recording, and Corey highlighted this when he pulled up MonoDevelop to show you a snippet of the code, we use OnAudioFilterRead. This is the callback invoked by Unity at the frequency you define using the mechanics from the previous slide, and what's passed to you is raw PCM audio, which is what we'll pass on to the encoder.

Then for video, and again this is something Corey highlighted, the continuation here, "yield return new WaitForEndOfFrame()", lets you be notified when frame rendering is fully completed, at which point you can turn around and read the complete frame back. There are a number of approaches to this. We experimented with a few of them, and they all have different tradeoffs.

Here's one. In this approach, I'm using Texture2D to read back the pixels at a resolution defined by you, so you can generate a video at a lower resolution than the original. If you want the video file to be relatively small, so it can be shared from mobile devices quickly, you can read the frames at a smaller resolution.

Another approach is to use glReadPixels, reading directly from the frame buffer through OpenGL. This is an example of another implementation that I did using Java; it's invoked from within the Java plugin to obtain the frame buffer. So that's the capture step.
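One wrinkle worth knowing if you try the glReadPixels route: OpenGL's origin is the bottom-left corner, so the rows come back upside-down relative to what most encoders expect, and you need a vertical flip before encoding. A minimal sketch, assuming RGBA at four bytes per pixel:

```java
public class FrameFlip {
    // Flip an RGBA frame vertically in place. glReadPixels returns rows
    // bottom-up, while encoders generally expect top-down scanlines.
    public static void flipVertically(byte[] rgba, int width, int height) {
        int stride = width * 4;            // bytes per row (RGBA)
        byte[] row = new byte[stride];
        for (int top = 0, bottom = height - 1; top < bottom; top++, bottom--) {
            System.arraycopy(rgba, top * stride, row, 0, stride);
            System.arraycopy(rgba, bottom * stride, rgba, top * stride, stride);
            System.arraycopy(row, 0, rgba, bottom * stride, stride);
        }
    }
}
```

Doing the flip on the Java side keeps the native readback path simple, at the cost of one extra pass over the frame.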
Once the frames are captured, the raw audio and raw video, we need to encode them, and for that we're using WebM with VP8, plus Vorbis for audio. Let's talk a little bit about how that is done. Referring back to our reference diagram, we're at that stage right there in the middle, where we obtain the raw frames and pass them down through the encoder.

One thing that is very useful on Android is that our WebM engineers have created JNI bindings for the encoders, both the Vorbis encoder and the VP8 encoder, and you can fetch them from code.google.com. It's a JNI wrapper with dependencies on a few native libraries, but it gives you really nice performance, and it's what we used in our application.

A couple of things to consider when using this. Think of it as a nice object-oriented way to do video encoding: if you're not comfortable writing C code, but you're an Android developer and you know Java, now you can encode video, and you can do it very easily.

Let's walk through the basic building blocks. For audio encoding there are a couple of classes, the encoder configuration and the actual encoder; for video, again, the configuration and the actual encoder; and then the WebM muxing capabilities, so you can write the compressed audio and video frames to a container and end up with a single file.

Here's how this looks in practice. We start with the byte array, which is basically the PCM audio given to us by Unity via the script attached to the camera, and similarly for video, we read the frames from the frame buffer. Ultimately, what we want to construct is an audio frame and an encoded packet, which represent the encoded audio and the encoded video. We obtain those by passing the raw audio bytes to the Vorbis encoder and the raw video bytes to the VP8 encoder, and then we use the muxer to save the audio and video frames. That also takes care of synchronization, so you don't really have to do any work in that area. What you get back is a video file.

Here's an overview, again, of the objects we just discussed: the encoder configurations for both video and audio, the actual encoders, the tracks representing the audio and video parts of the WebM file, and the muxer. This is why I see it as an object-oriented way of doing video encoding.
And I really like it, because I think it's very approachable. For those of you who would like to do a little more work in this area, but have been intimidated by having to deal with a bunch of native code and large libraries, this provides an abstraction that is very easy to use.

Once we have the video and audio, the only thing remaining is the actual upload, and we do that using the YouTube Data API, version 3. In order to access it, you need to register your project in the Developer Console. One thing to note is that we are uploading the video into the user's account; therefore, we must obtain permission from the user for our application to act on his or her behalf. There's a little bit of magic involved in the way that takes place, and it actually works very nicely. The only thing you need to know is that when you register your project in the Developer Console, you need to supply a fingerprint for the key you're going to sign the application with, and you have to tell Unity to use a specific keystore, or a specific key, to sign your APK. Once you do that, everything just works like magic: our API back end knows, on the server side, that hey, this is your application generating these requests, so you don't actually have to change any code for this.

Then for the upload process, a couple of hints. Don't block the user waiting for the upload; you can do it as a service, and use a notification to indicate the progress. And use resumable uploads, because if you lose connectivity, which is, in fact, very likely at [INAUDIBLE] right now, the upload should resume automatically.

So here's one video that I created a little earlier. Let's just play it. I'm going to switch to the mobile device. Remember the notification I was popping up as part of the video upload? Right now it indicates that my video has been successfully uploaded.
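An aside on the resumable-upload hint above: when an upload is interrupted, the protocol lets you ask the server how many bytes it already received (a 308 response carrying a Range header such as "bytes=0-49999") and resume from the next byte. The Java client library normally handles this for you; a sketch of the offset arithmetic, with the class name invented:

```java
public class ResumeOffset {
    // Given the Range header from a 308 "Resume Incomplete" response,
    // e.g. "bytes=0-49999", return the byte offset to resume from.
    // A missing header means the server received nothing: start at 0.
    public static long nextOffset(String rangeHeader) {
        if (rangeHeader == null || rangeHeader.isEmpty()) return 0L;
        int dash = rangeHeader.lastIndexOf('-');
        // The value after the dash is the last byte received, inclusive.
        return Long.parseLong(rangeHeader.substring(dash + 1)) + 1;
    }
}
```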
"Watch the video on YouTube." So when I click it, let me just select it, I can see the gameplay right in my game. I'm going to tell you a little more about how that is implemented as well.

But before that, just a quick note on authorization. I mentioned that we use OAuth 2.0 in order to upload the video on behalf of the user. To implement that in your Android application, you can use Google Play services. We're using GoogleAuthUtil, which makes it very straightforward to implement the OAuth flow. Really, the only thing you need to remember is the scope you're going to request from the user. In our case, I'm using the YouTube read-only scope, because I need that to check on the upload status, and the YouTube upload scope, which grants my application the right to upload videos. That is reflected here by this popup, generated by the auth flow in Google Play services.

I was just showing you a moment ago how in-app playback works. The reason that's useful: earlier today, you learned about the new capabilities launching in the gaming area. Personally, I think leaderboards and achievements
are very cool, but I want a video to prove it. So one use case you can implement, if you have the upload capability, is to associate or keep track of these achievements and leaderboards, and show the most interesting footage from your users in the app itself. For that, you can use the YouTube Android Player API. This is an API we launched a few months back, and as I was showing you earlier, it provides high-quality in-app video playback.

Very little work is required to make that happen in your application. Really, all you need to do is drop in this library here, the YouTube Android Player API. It's a small client library that relies on the YouTube app to do the actual video playback, so it's very robust: it has all the capabilities you see in the YouTube app, and you can make that available inside your own app, running inside a Unity game, using the plugin capability.

A couple of things to note. The YouTube Android Player API requires a developer key, which is slightly different from the dev key we used for the uploads. And we may prompt the user to upgrade the YouTube app on the device: if the app is out of date, because the user hasn't updated it or it's not set for automatic updates, we generate the error SERVICE_VERSION_UPDATE_REQUIRED. Typically that doesn't happen, but this is how we make sure the latest capabilities and bug fixes are available to the user when they try to use the YouTube API for video playback inside your Unity game.

Also, our transcoding pipeline takes a little while,
especially if it's high-quality video, and we transcode it to a number of different formats. The YouTube Data API v3 has a nice capability here: you can check the processing progress to find out whether a video is ready to be shown on all the platforms YouTube supports. I highly encourage you to do that before you do, say, social sharing.

I was requesting a Google+ permission in my auth flow a few slides earlier, and the use case there is, once I'm done with the uploading, I want to share the video with the world. But share it only once the processing progress indicates that the video has been completely processed, because that means it will play on every single type of device YouTube supports; all the transcodes have completed.

So a few links. If you'd like to learn more about our APIs, go to youtube.com/dev. This is the link to the repository that hosts all the JNI wrappers for WebM, VP8, Vorbis, and libyuv, which lets you do RGB-to-YUV conversion, a little detail we don't get into too much during this talk. And we are working on polishing up this demo app, and we're planning to open source it as well, so you can try it out yourself.
So we hope this capability will excite your users. Now, it's 2013, so I figured someone playing a tablet game would be an appropriate way to conclude this presentation. If you have any questions, please come up to the mic. We have a few minutes left.

AUDIENCE: So this would record as the user is playing the game, right? What's the impact on performance that you expect on the device? Because gaming already consumes significant horsepower on the platform, and then you're going to record at the same time.

Jarek Wilkiewicz: So the question is, what is the performance impact of this capability? Actually, the way this demo app is implemented, we are using a trick where the gameplay recording is not happening at the same time as the game. I'm doing a replay, using the Unity capability where you can tell it to render frames at certain speeds, so there is no impact while you play the game.

However, when you're ready to share, depending on the device and depending on how you've configured it (for example, if you want HD, high-quality video; those are parameters you can pass to the encoder), the actual rendering of the frames may or may not keep up with the frame rate you get when playing the game. So there's an additional step that involves basically rendering the frames one by one. Right now, because I'm using a pretty expensive way to fetch the frames from the frame buffer, at low resolution and low frame rate I can keep up with the frame rate of the game.

At high resolutions, I can't. However, that doesn't impact the gameplay; it impacts the replay. Ideally, what we would want-- and we have a couple of partners that do that for other platforms-- in fact, there's one in the sandbox, and one of them spoke earlier today-- as you see, raising his hand.
So there are other approaches to this. They don't currently work on Android, but they're an alternative that can be used. Hopefully we'll get to that same level of performance, but the impact right now is zero, because the step of rendering the gameplay is distinct from the play itself.

AUDIENCE: You also said VP8, and you're going to VP9. Does the developer pick the codec there? Is it going to be VP8 or VP9, or is it YouTube that decides?

Jarek Wilkiewicz: Yes. So the question is, what does it mean now that we have VP9 as well as VP8? The good news is, you don't really have to care. Frankly, as long as you give us the content in any of the formats we support for YouTube ingestion, we turn around and transcode it into everything imaginable that's required by all the devices we support.

We chose VP8, and we'll get into this in more detail tomorrow, because it's an open source, royalty-free codec. You don't have to pay anybody any money, you don't have to pay any royalties, and you can do whatever you want with it; it's a very liberal license. So we find this is a good fit for these types of applications.

The way YouTube works is, once you upload the video, it takes care of transcoding into the formats supported by different devices. If a target playback device only supports H.264, we'll use that transcode. Our Android Player API, which I demonstrated earlier for playback, uses one of these transcodes. So it really is totally transparent to the developer.

The only thing you have to know is that all these codecs have specific requirements. For example, VP8 requires a YUV representation of the data. That's why we also wrapped libyuv, so you can take the RGB data you get from the frame buffer, convert it to YUV, and pass it into VP8. So there's a little bit of knowledge required, but I would say it's pretty minimal.
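That RGB-to-YUV step is, at its core, a per-pixel matrix transform. libyuv does it fast in native code, but the BT.601 "studio swing" math itself is simple. A sketch for a single pixel (full-range variants of the formula differ slightly):

```java
public class Rgb2Yuv {
    // BT.601 studio-swing RGB -> YUV for one pixel: the same arithmetic
    // libyuv applies, vectorized, across the whole frame. Y lands in
    // [16, 235]; U and V are centered on 128.
    public static int[] toYuv(int r, int g, int b) {
        int y = (( 66 * r + 129 * g +  25 * b + 128) >> 8) + 16;
        int u = ((-38 * r -  74 * g + 112 * b + 128) >> 8) + 128;
        int v = ((112 * r -  94 * g -  18 * b + 128) >> 8) + 128;
        return new int[] { y, u, v };
    }
}
```

For VP8 you would also need to downsample U and V to 4:2:0 (one chroma pair per 2x2 block of pixels), which is the other half of what libyuv's conversion routines handle for you.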
AUDIENCE: So you answered part of my question with respect to the transcoding. Now, for recording, I assume it would also be H.264? Because most devices don't necessarily have support for VP8 HD recording.

Jarek Wilkiewicz: Yeah. So we are using a software encoder right now. The way this application is built, the actual VP8 codec, Vorbis codec, and WebM container are shipped as a native library, wrapped through JNI and packaged in the Android plugin that's integrated with Unity. So everything required to render the video is included in the game itself. Android does have the capability to use the underlying hardware encoder, and that is typically H.264. Though, and again, this is something tomorrow's session will go into in more detail, the world there is also becoming more attractive. But the nice thing about this approach is there's no dependency on a specific device or a specific version of Android, because it's just C code that you compile using the NDK, plus the JNI wrappers. And it's also very small.

AUDIENCE: Thank you.

Jarek Wilkiewicz: All right, so we have 20 more seconds left, if there's one more question.
AUDIENCE: From the beginning, I just want to ask: can we record the normal Java screen? I mean, the application screen in Android, without [INAUDIBLE]?

Jarek Wilkiewicz: Yeah, so the question is, can you record the application's screen? And I believe the answer is, for an arbitrary app, no. Typically, that's because of sandboxing requirements and whatnot. That's why this is something that has to be built into the application itself, and this is really how, for PC games--

AUDIENCE: What are the mechanisms for [INAUDIBLE]?

Jarek Wilkiewicz: So in this session, we described how that can be done with Unity, and the mechanism is that Unity has integration points to obtain the raw audio and the raw video frames. That's what is actually passed into our encoder and uploaded to YouTube. But it's application specific. Unity has this capability; it relies on the GL capability. So this is something that is app specific, and if that's not clear, I'll hang out after this and we can go into more detail.

All right, I think we're out of time. Thank you very much for coming, and please rate our session.