John ๐Ÿ‘๐Ÿ’š๐Ÿ”ฎโš•โšกโ˜€๏ธโ™ป๏ธ๐ŸŒณ๐Ÿ”ธ(@johnhanacek) 's Twitter Profileg
John ๐Ÿ‘๐Ÿ’š๐Ÿ”ฎโš•โšกโ˜€๏ธโ™ป๏ธ๐ŸŒณ๐Ÿ”ธ

@johnhanacek

#XR Interaction Designer @nanome_inc ◇ He/Him/They/Xyr ◇ Writer #RegenerativeFutures Artist ◇ Infinite Player ◇ @GlobalCoLabNet @Avatar_MEDIC @CanCoverIt

ID: 979389758

Link: https://jhanacek.net · Joined: 29-11-2012 23:48:40

11.6K Tweets

1.7K Followers

5.0K Following

John ๐Ÿ‘๐Ÿ’š๐Ÿ”ฎโš•โšกโ˜€๏ธโ™ป๏ธ๐ŸŒณ๐Ÿ”ธ(@johnhanacek) 's Twitter Profile Photo

So this is ridiculous but also the future.
Design perspective: XR headsets give humans a way to visit the simulated training grounds and to direct robots in the 'real world' in the same context. If anyone is working on robots x XR shared HI & perception, please let me know ;)

John (@johnhanacek):

I always felt this was the 'first' use of the whole XR tracking tech stack: realtime VFX.
So instead of filming nothing and adding effects later, film actual composited, tracked virtual elements and then render them better later. Cinematography for mixed reality :)

John (@johnhanacek):

Sparkles here. If this can get more refined and customized through experience for suggestions, consider what is even being rendered at all: static graphics, or an entire computer? Why not both :) jhanacek.net/art-math-math-โ€ฆ

John (@johnhanacek):

"Digital computers could easily manipulate all the things that humans had been counting, especially their precious and beloved money. Yet when it came time to use them to explore the structure of the world, the computers were slow, far below real time."

medium.com/@johnhanacek/iโ€ฆ

John (@johnhanacek):

Real look at genAI in production; manual work still very much required :)
Almost like a documentary: "For the 1:30 of footage in the film, they generated 'hundreds of generations at 10 to 20 seconds a piece. probably 300:1 in terms of the amount of source material to what ended up in final.'"

John (@johnhanacek):

Like folks seriously, a revolution is going on in rendering and computer graphics right now. If you know about meshes and point clouds, you know why this new method is so magical and will change cinema and video games forever imo

John (@johnhanacek):

After crashing my car a few weeks ago I'm still shaken up about driving. We are so gnarly in the USA with how risky we are every day! Stay safe everyone.

PS if you want to help me out (since insurance didn't really cover it all), buy my stories & tell a friend! jhanacek.net/stories-3

John (@johnhanacek):

There is no space between things, light waves are everywhere, edge btwn I & world is unclear.
There is only space between things, light photons must traverse distance and interact with atomic cloud puffs fluffed together to make a seemingly solid world.
youtu.be/rbVXu4GRSUs?siโ€ฆ

John (@johnhanacek):

And appreciate how Figmin has been this multi-modal, multi-platform through-line of delight, recreating a universe of affordances from scratch just to show us a future that headset vendors are only now learning how to provide ;)
