Episode 159


James and Stefan discuss over a dozen topics in this week’s episode, including:

Note: You can subscribe via RSS or iTunes.

[Episode 159, 37 minutes and 09 seconds, 26.7 MB]


15 thoughts on “Episode 159”

  1. rcadden says:

    Avid Google Play Music All Access user/subscriber here. Absolutely no plans to stop using PocketCasts for podcasts. Podcasts will be tacked on to GPMAA and treated like music, most likely. PocketCasts is specifically built for podcasts.

    1. Stefan Constantinescu says:

      I love, love, love Pocketcasts, but their audio engine is totally busted on iOS, so I use Overcast instead. I’ve emailed the developers about this issue, and they said they’re working on a totally new version that will (hopefully) come out this spring.

      But to address your main comment, I’m in full agreement, Google is likely going to treat podcasts as a checkbox feature that they have to add because everyone else is.

      Granted, for the podcast community, it doesn’t hurt to have Google finally include a podcast player/catcher in Android out of the box versus making users have to go out to the Play Store and find an app.

      Apple’s iOS podcast app is basic AF, but it’s there, and that counts for something.

  2. Michael media says:

    Hi guys, I’m on Play Music and find it to be the best all-you-can-eat streaming service, especially with YouTube Red bundled into the flat $9.99 monthly price. PocketCasts is my podcast app of choice, and I’d probably continue using it over podcasts in Google Play, strictly because I paid for it on both Android and iOS. Love the show, continue to keep each other in check.

    1. I did not know that YouTube Red was included in the Play Music monthly fee. That’s pretty awesome. Now we just need YTR to get some decent content 😉

      And thanks for the feedback; it’s always appreciated.

  3. kenibarwick says:

    Guys a handset recommendation…

    I’m a happy Windows Phone user, insert sniggers and usual why bother comments here.

    Now that we have that over with, I’d like to experience the “awesomeness” of Android that you both discuss at great length every week.

    Which device should I get, bearing in mind I won’t be using it as a phone, nor using it to take photos?



    1. Stefan Constantinescu says:

      This is a great question, and I’m debating whether or not I should answer it here or wait until James and I record next week’s episode.

      1. kenibarwick says:

        Thanks for your suggestion on the next episode, I’ve taken you up on it… Should be with me by the weekend… Just don’t tell the missus 😉

  4. You wondered on the podcast about dual-lens cameras and their advantages, and talked about reading up on them. Maybe I can give you a little primer – I work with sensing the world and dual-lens cameras (amongst other things) – I can see three stereo pairs attached to robots where I sit right now. I could talk for hours about this stuff (if you ever want a far-too-geeky guest host 😉 ). Anyway, what might Apple/someone do with two lenses on a smartphone? Some idle informed speculation:

    – Better quality images.
    You could improve resolution or decrease noise because, bluntly put, you’re sampling more of the world. Think of it like this: each pixel on a sensor represents an estimate of what colour that part of the world actually is. It’s not perfect, and so image quality isn’t perfect. With two sensors, you get two estimates you can combine into one better one.
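    A toy numerical sketch of that idea (hypothetical numbers; this shows only the statistics of combining two estimates, not real sensor fusion): averaging two independent noisy readings of the same pixel shrinks the error by roughly a factor of √2.

```python
import random

def sample_pixel(true_value, noise_sigma):
    """One sensor's noisy estimate of a pixel's true brightness (hypothetical model)."""
    return true_value + random.gauss(0, noise_sigma)

random.seed(0)
true_value, sigma, n = 100.0, 10.0, 10_000

# Average absolute error of a single sensor...
single_err = sum(
    abs(sample_pixel(true_value, sigma) - true_value) for _ in range(n)
) / n

# ...versus the average of two independent sensors looking at the same scene.
paired_err = sum(
    abs((sample_pixel(true_value, sigma) + sample_pixel(true_value, sigma)) / 2 - true_value)
    for _ in range(n)
) / n

# Combining two independent estimates reduces the noise by roughly sqrt(2).
print(paired_err < single_err)  # prints True
```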

    -“Fake” images, aka computational photography.
    With two lenses, giving slightly different views of the world (focal lengths, depth of field etc, focus points) you can think about how given two different photos you could reconstruct any “intermediate” picture. Think a good way to fake DSLR like shallow depth of field, or re-focus after the event in software.
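    To make the depth-of-field trick concrete, here is a minimal 1-D sketch (the function and numbers are hypothetical, nothing like a real imaging pipeline): given a per-pixel depth estimate, keep pixels near the focus plane sharp and blur the rest, which is the essence of software “portrait mode” and after-the-fact refocusing.

```python
def fake_shallow_dof(pixels, depths, focus_depth, tolerance):
    """Toy 1-D 'portrait mode': pixels whose depth is near the focus plane
    stay sharp; out-of-focus pixels are replaced by a local average (a crude blur)."""
    out = []
    for i, (p, d) in enumerate(zip(pixels, depths)):
        if abs(d - focus_depth) <= tolerance:
            out.append(p)  # in focus: keep as-is
        else:
            lo, hi = max(0, i - 1), min(len(pixels), i + 2)
            out.append(sum(pixels[lo:hi]) / (hi - lo))  # out of focus: blur
    return out

# Hypothetical scene: the first two pixels are near the camera (depth 1.0),
# the rest are background (depth 5.0). Focus on the near subject.
pixels = [10, 200, 10, 200, 10]
depths = [1.0, 1.0, 5.0, 5.0, 5.0]
print(fake_shallow_dof(pixels, depths, focus_depth=1.0, tolerance=0.5))
```

    With a real dual-lens phone the depth map itself would come from comparing the two views, which is what the next point is about.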

    – Better image stabilisation
    With two cameras, you get something special: 3D information about the scene. But it’s more than that; if you can track the scene in 3D, you actually know _how the camera moves_ relative to the scene. Now, doing this in realtime is hard without cunning GPU computation or dedicated hardware. But guess what? Apple design their own chips. They could have dedicated motion estimation/stereo processing hardware. This would be a competitive advantage no-one else would have. With better estimation of the camera movement, you can stabilise the image better in software… but also…
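    The 3D information comes from disparity: a feature shifts between the two views by an amount inversely proportional to its depth. A minimal sketch of the classic pinhole-stereo relation (the phone-ish numbers are hypothetical):

```python
def depth_from_disparity(focal_px, baseline_m, disparity_px):
    """Classic pinhole stereo: depth = focal length * baseline / disparity.
    A wider lens separation (baseline) gives better depth precision."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive (point at infinity otherwise)")
    return focal_px * baseline_m / disparity_px

# Hypothetical numbers for a phone-sized stereo pair: 2800 px focal length,
# a 1 cm baseline, and a feature shifted 20 px between the two lenses.
print(round(depth_from_disparity(2800, 0.01, 20), 2))  # prints 1.4 (metres)
```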

    So at the moment, VR on phones tracks the _angle_ you look at with the accelerometer/gyros, but you can’t walk around in VR. On a Rift, you can move your head around a bit as there’s a webcam tracking you. On an HTC Vive you can walk around the room. But if you have stereo cameras on the phone (headset) AND dedicated hardware to process it fast enough to track the headset pose in realtime, now you have something really special. You could have a VR headset with no wires that allows you to walk around in VR. Now that’s something I believe only Apple could do. If I was Apple looking to move into VR, and thinking about what it might look like done “right”, it’d be this. Pop your phone in the headset, VR you can walk around in, no wires.

    Anyways, I could speculate all day, but that’s enough to get you thinking….
