Activity
[Contribution heatmap: member activity over the past 12 months]

Memberships

Reality Uni

Public • 26 • $49/m

10 contributions to Reality Uni
LabVision
An update to my last post. I've polished up the prototypes and I'm excited to share the progress, and to start taking these ideas to our customers. https://youtu.be/g2Es4ThzHFQ
3
1
New comment 4d ago
LabVision
Prototype Preview
Here's a little preview of what I've been working on: prototyping a potential use case to present to clients soon. We're a step closer to getting barcode activation working, but not quite ready for a demo.
4
5
New comment 12d ago
Prototype Preview
1 like • Oct 13
Thanks. I think we'll push something to TestFlight in the next month or so. Will keep you posted :-)
1 like • 12d
@Nikhil Jacob I'm still a few weeks away from a TestFlight version, but I'm preparing a more substantial video to share with our customers at the moment. I presented the prototypes at a recent event for the Vision Pro and got a great response. There were a few Apple representatives there as well, and they're showing strong interest in supporting the work we're doing, which is great. Will share the video here soon too.
open UI window from 3D object
I have a 3D model in Reality Composer Pro that I can open as a volumetric window via SwiftUI code. However, I'm not sure how I can then tap on that model to open a SwiftUI view. Keen to do the following: 1. Open an independent SwiftUI view via a 3D model interaction (tap gesture). 2. Open an anchored SwiftUI view via a 3D model interaction (where the view is connected to an entity in RCP). I watched the diorama video (https://www.youtube.com/watch?v=-1qdEYYqrkA&t=1239s) but I'm not quite making the connection. Hoping someone might have some basic code samples or projects that do the above. Thanks.
1
7
New comment Oct 12
open UI window from 3D object
0 likes • Oct 12
I think I cracked it with this tutorial! https://www.youtube.com/watch?v=LVDWT_zXaog
0 likes • Oct 12
It requires Input Target and Collision components on the entity.
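To spell that answer out, here's a minimal sketch of the first goal (tap the model, open an independent window). The scene name "Prototype", the window id "detailView", and the RealityKitContent bundle are placeholders for this example; in practice the Input Target and Collision components can also be added in Reality Composer Pro's inspector instead of in code.

```swift
import SwiftUI
import RealityKit
import RealityKitContent  // package Xcode generates for an RCP project (name assumed)

struct PrototypeTapView: View {
    // Requires a matching WindowGroup(id: "detailView") declared in the App struct.
    @Environment(\.openWindow) private var openWindow

    var body: some View {
        RealityView { content in
            // "Prototype" stands in for the scene/entity name exported from RCP.
            if let model = try? await Entity(named: "Prototype", in: realityKitContentBundle) {
                // Tap gestures only reach entities that carry both of these components.
                model.components.set(InputTargetComponent())
                model.components.set(CollisionComponent(
                    shapes: [.generateBox(size: [0.2, 0.2, 0.2])]))  // placeholder shape/size
                content.add(model)
            }
        }
        .gesture(
            TapGesture()
                .targetedToAnyEntity()
                .onEnded { _ in
                    // Goal 1: open an independent SwiftUI window from the 3D tap.
                    openWindow(id: "detailView")
                }
        )
    }
}
```

For the second goal (a view anchored to the model), RealityView's attachments closure can declare a SwiftUI view and hand back an entity that you parent to the RCP entity, so the view stays connected to it in the scene.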
Barcode detection in AVP
I'm using this as a starting point but not getting too far. Anyone have any tips? VNDetectBarcodesRequest | Apple Developer Documentation
2
4
New comment Sep 14
1 like • Aug 30
@Nikhil Jacob A couple of interesting links to consider as well: https://developer.apple.com/documentation/arkit/barcodedetectionprovider?changes=latest_major https://github.com/robomex/visionOS-2-Object-Tracking-Demo
1 like • Sep 14
@Nikhil Jacob It needs enterprise access; our CEO/CTO has applied for it, and we're waiting to hear back.
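For anyone picking this thread up later, here's a minimal sketch of the Vision route from the original post, run against a still CGImage. The image source and symbology list are assumptions, and as noted above, live camera frames on Vision Pro are gated behind the enterprise entitlements.

```swift
import Vision
import CoreGraphics

/// Detects barcodes in a still image and returns their decoded payload strings.
func detectBarcodes(in image: CGImage) throws -> [String] {
    let request = VNDetectBarcodesRequest()
    // Restricting symbologies (e.g. QR only) makes detection faster and more reliable.
    request.symbologies = [.qr, .ean13, .code128]

    let handler = VNImageRequestHandler(cgImage: image, options: [:])
    try handler.perform([request])

    // Each observation carries the decoded payload plus a bounding box.
    return (request.results ?? []).compactMap { $0.payloadStringValue }
}
```

The ARKit BarcodeDetectionProvider linked above follows the same idea but delivers world-anchored barcode detections from the live camera feed, which is why it sits behind the enterprise entitlement.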
Plan for next month/updates
Hi folks, I'm in the final stages of shipping my visionOS app "Inner Chronicles" at the moment and must say, I've learnt a lot of 3D-centric stuff with this project. Speaking of which: as soon as I'm done with this, I'll start working on the "Mastering visionOS: RealityKit" course and drip out lessons here one by one. Expect a lot of 3D-centric content there. The plan is to start work on it by the second week of September at the latest (the aim is to finish the course before the end of 2024), and before that I'll share the course outline here to get your input/feedback on any particular topics you'd like me to cover. So do keep an eye out for that soon!
1
4
New comment Aug 29
1 like • Aug 24
Sounds great! I'm keen to see how interactions are created between 2D UI and 3D objects, and also how a 2D video might trigger 3D objects in an immersive experience at various time points (e.g. the Gucci experience: balloons on screen, balloons all around). Some areas I'm exploring: real-world object tracking, barcode/QR code activations, simple image galleries, carousel components (foreground and background), simple styling and animation, responsive elements, etc., and publishing to TestFlight. Looking forward to seeing 'Inner Chronicles' too!
Melissa Randall
3
38 points to level up
@melissa-randall-7541
Creative Director at Equinox Ventures / Creative Technologist / UX / UI / XR Designer

Active 1d ago
Joined Aug 4, 2024
Melbourne Australia