Speaking as a senior Oculus lead, the most interesting thing about this write-up for me is what it lacks: it says almost nothing about the software stack / operating system. It's still 100% about hardware at one end and end-user applications at the other. But there is no discussion of the platform, which to me is actually the highest-value proposition Apple is bringing here.
In short: Apple has made a fully realized spatial operating system, while Meta has made an app launcher for immersive Unity/Unreal apps for vanilla Android. You can get away with an app launcher when all you want to support is fully immersive apps that don't talk to each other. But that fails completely if you are trying to build a true operating system.
Think about what has to exist to, say, intelligently copy and paste parts of a 3D object made by one application into a 3D object made by another, the same way you would copy a flat image from Photoshop into a Word document. The operating system has to truly understand 3D concepts internally. Meta is building these features, but it is stuck in a really weird space trying to wedge them in between Android underneath and Unity/Unreal at the application layer. Apple has had the advantage of green-field engineering it exactly how they want it to be, from the ground up.
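To make that concrete: before cross-app 3D copy/paste can exist, the OS has to define an engine-neutral payload that any app can produce or consume. Here's a hypothetical sketch of what that might look like (all names and types are invented for illustration, not any real Apple or Meta API):

```python
from dataclasses import dataclass, field

@dataclass
class Spatial3DPasteboardItem:
    """Hypothetical payload for an OS-level 3D clipboard.

    For cross-app paste to work, the OS (not Unity/Unreal) must own a
    neutral representation: geometry, a world transform, and material
    references the receiving app can re-bind to its own renderer.
    """
    vertices: list                     # (x, y, z) tuples in shared units (meters)
    transform: list = field(default_factory=lambda: [
        [1, 0, 0, 0],
        [0, 1, 0, 0],
        [0, 0, 1, 0],
        [0, 0, 0, 1],
    ])                                 # 4x4 row-major world transform
    material_ids: list = field(default_factory=list)  # portable refs, not engine handles

def copy_to_pasteboard(item: Spatial3DPasteboardItem, pasteboard: dict) -> None:
    # The OS validates and stores the neutral form, so the pasting app
    # never needs to know which engine or app produced the geometry.
    assert all(len(v) == 3 for v in item.vertices), "geometry must be 3D points"
    pasteboard["public.3d-content"] = item

# A triangle "copied" from one hypothetical modeling app, ready for another:
pasteboard = {}
copy_to_pasteboard(
    Spatial3DPasteboardItem(vertices=[(0, 0, 0), (1, 0, 0), (0, 1, 0)]),
    pasteboard,
)
```

The point of the sketch is the ownership boundary: the common representation lives in the OS, which is exactly what an "app launcher for immersive Unity/Unreal apps" doesn't have.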
For me personally, it's definitely the platform. Requiring a Meta / Facebook account for already-purchased Oculuses, retroactively bricking devices and deleting software which was bought before that requirement, has put Oculus firmly in the "hardware I will never consider in my life" camp.
It's an incredible amount of goodwill to burn from a company with so little to spare, and I'm surprised it hasn't come up yet in this thread or in the blog post. Meta has fundamental trust issues.
That isn’t really what the parent is talking about at all…
If your issue is with the device requiring connection to an external account, the Vision Pro requires an Apple ID, which will tie it to way more of your digital things than a Facebook login.
The Oculus was bought by Facebook from a startup and the Facebook/Meta account requirement was imposed later. This history is a reason I would not buy an Oculus.
The AVP does not have this history, and as far as I know, Apple hasn't retroactively required an Apple account on hardware sold without the requirement. (I could be wrong here, and Apple certainly is not a blameless tech company.)
Like the parent comment, I agree that this blog post completely skirts the issues most important to me. Like the parent comment, I agree that the platform is the distinction, but for different reasons. If I owned an Oculus, I would consider it to have been bricked.
Apple’s business model isn’t selling your personal information. In fact, they go out of their way to protect your personal info. Requiring an Apple ID is significantly less concerning than requiring a Facebook account.
How exactly do you know this, and why come on here and state it as fact?
It's a dubious claim at best, and if it were true it would raise the elephant-sized question in the room: why collect my data if you aren't going to use/sell it anyway? Is collecting my data just a fun internal project at Apple?
What data are they collecting about you, and who is it being sold to?
With Facebook it's an easy answer: hobbies, interests, and habits to use for targeted advertising.
That logic doesn't quite work for Apple, given they aren't showing ads in your OS, only in the App Store.
It's pretty obvious Apple's revenue focus is on hardware and services, not on ads.
This is neither obvious nor accurate.
Apple's advertising business is estimated to be closing in on $10Bn yearly, while their browser pre-load deal with Google brings in even more than that, and data-sharing agreements are assumed to be part of the deal.
The fallacy that Apple can be trusted with your data because it makes more money elsewhere is incredibly naive and one-sided.
Just to put this in perspective, Apple's yearly revenue is closing in on $400B. $10B is in the "other" category in the revenue chart.
The ads are still only in the App Store, where you go specifically to look for... apps. And you'll get ads for... apps.
You're not getting personalised ads for incontinence products, because Apple doesn't know nor care that you have frequently visited a doctor specialising in said issues. Your phone does, and Siri will suggest the exact place as a destination if you start up Maps in your CarPlay view.
Google and FB _will_ serve you said ads, and with eerie accuracy and speed. (I've visited a niche product store in my desktop browser, opened Instagram 5 minutes later, and received ads for that exact niche.)
> data-sharing agreements are assumed to be part of the deal
You're making a massive assumption there. Unless you can back that up, it is what it is: FUD.
You'll be fully aware that, as part of Google's antitrust case, it emerged that Apple was only willing to enter into data sharing with Google if Google provided data to Apple, which they refused to do.
So again, we're still in a position where the only place Apple is focused on ads is the App Store, so I'll ask again: what data are they collecting about you, and who is it being sold to?
> its a dubious claim at best
Wow, I'll bite: proof by contradiction. There isn't a single credible article claiming Apple sells their customers' information.
I'd personally just like to see things even more separate. I wouldn't want what I do on my VR headset tied to anything else.
Sorta like how I might have a Nintendo account (or whatever they have now) for games on a Switch. It's just about my gaming activity on that one platform, and that's it.
Tying a headset to a Meta/Facebook account is just too much: I don't want my Oculus activity tied to my social media.
I agree that the Apple situation is better, but I wouldn't want my Vision Pro activity tied to iPhone, Mac, Apple TV, etc. activity.
And I get it from the "building an integrated ecosystem" perspective, and don't really begrudge them their desire to make something like that, but I'm just generally tired of being a part of some company's ecosystem.
How about not requiring accounts for all these different gadgets at all? What's wrong with being able to run whatever software you want on your VR headset or gaming tablet?
https://digiday.com/media-buying/apples-expanding-ad-ambitio...
>In the meantime, Apple continues to work with ad tech vendors it trusts — or rather, those with stated policies it approves of — particularly when it comes to a cornerstone of the iPhone maker’s brand: user privacy.
>However, a key question remains: how will Apple ensure user privacy as its ad ambitions expose the iOS ecosystem to a sector of the media landscape with a chequered record when it comes to a cornerstone of its brand promise?
>Earlier this year, it unveiled a tool it will use to police user privacy in the guise of Privacy Manifests (see video above), a measure that many interpreted as Apple’s attempt to (finally) stamp out illicit user-tracking, a.k.a. fingerprinting.
Apple has a vested interest in user privacy and talks about it constantly. Facebook has an interest in selling every piece of information they have about you to the highest bidder and has talked about how stupid users are to give them personal information.
They are not the same.
Apple's "privacy" is really privacy from people who are not Apple. Apple has access to location logs via Maps and location services, the contents of your photos, iMessage contents, the history of every app usage, etc., since iCloud Backup, which the vast majority of users leave on, covers most things by default. They were almost going to deploy on-device scanning that you couldn't opt out of for the few photos that don't end up on iCloud.
None of their devices work properly unless they constantly phone home to Apple. For the devices to be anywhere near useful, you need an Apple ID, which requires an attached KYC payment method or a KYC phone number.
> Apple's "privacy" is really privacy from people that are not Apple.
Sort of, but misleading…
> Apple has access to location logs via maps and location services
There are published policies about how this data gets aggregated+anonymized and then used. Care is taken to ensure data is not linked to individuals.
> The contents of your photos, iMessage contents, the history of every app usage, etc as the default settings set for most things via iCloud backup
Most people want these things backed up. Apple doesn’t just dive through data. Anything that even approaches the description of dealing with user data is carefully vetted. A big difference between Apple and other large tech companies is the internal boundaries for access to any data. It’s strict and limited by design. Apple has fought back against law enforcement for access to personal data or technology to allow governments to carte blanche access devices.
> They were almost going to deploy on device scanning that you couldn't opt out of with the few photos that don't end up on iCloud.
You are referencing the CSAM scanning for known child pornography based on international databases that was tuned for highly unlikely false-positive rates with a small group of reviewers to further reduce any chance of false-positives? Yeah, total travesty… wouldn’t want to do anything about THAT problem. (/s)
> Apple has a vested interest in
Shareholder profit.
> and talks about it constantly.
Must be true then. Definitely not marketing.
> Facebook has an interest in selling every piece of information they have about you to the highest bidder
Nothing you've said suggests any different of Apple.
You're both right. Apple believes (as do many) that one way to achieve great shareholder profit is to differentiate yourself, and perhaps gain some pricing power, by truly prioritizing consumer privacy. They're both true.
>Shareholder profit.
This is a tired trope. They have a clear interest in user privacy, as evidenced by their actions. You're either just trolling or being intentionally ignorant to the state of the market if you're claiming their only focus is "shareholder profit". Apple isn't Boeing.
>Must be true then. Definitely not marketing.
I mean, there are countless examples: from the default encryption in Messages, to the ability to end-to-end encrypt iCloud backups, to a literal Lockdown Mode in iOS to protect against nation-state actors.
https://support.apple.com/en-us/105120
>Nothing you've said suggests any different of Apple.
You've provided absolutely nothing of substance beyond a link where Apple literally states they have hard requirements around user privacy for any advertising partners.
I'm done engaging in the conversation unless you've got something of substance to provide. The low effort one liners don't really have a place on HN.
If you pay attention to the marketing, Google, Samsung, Qualcomm, Sony, and Microsoft make most if not all of the same claims. I haven't seen one that operates in a way that can prove it. All have some level of custom silicon involved. All could provide hardware documentation, source code, and installable or transparent keying, barring the legal agreements between them, of course. I commend Apple for their amazing effort to uniquely ID each individual sensor and storage device in the world and tie it permanently to the phone's unique ID, but again, it would be nice if the details of how that worked were published, such that folks like Louis Rossmann could repair folks' broken phones and laptops.
> The low effort one liners don't really have a place on HN.
Maybe you just didn't think about them long enough.
this.
One of Apple’s biggest selling points is privacy.
I've been using the AVP every day for about 3 hours and it's truly stunning for a version 1 device. I can't imagine what version 5 will be like.
I’ve also used the oculus but returned it.
Selling points, not actual privacy. The fact that you have to identify yourself (and link that identity to your hardware serial number) to install apps on your own device is the opposite of privacy.
They are in PRISM just like any other, stop believing this lie.
PRISM is just the NSA's management system for sending lawful (kinda) information requests for specific accounts under the FISA Act to the companies. If data is not leaving your device (e.g. not synced to iCloud), then they can't get it.
Also, if you opt in to Apple's Advanced Data Protection[0], they can't even get that, because your data is fully e2e encrypted and the key is only stored on your phone. Which is probably why the FBI had to sue Apple to get data on the San Bernardino terrorists and then worked around it by hacking into their devices. Show me another big tech company that does this.
[0] - https://support.apple.com/en-us/102651
Don't believe their lies. They let agencies directly into their supply line. Nobody is above the law, and they must comply like any other.
The Bernardino case was a stunt to get people believing this lie. All the other cases in which Apple complies are not made public, though.
Corporations are not your friend.
Right and you know this because…?
Snowden.
I think your timeline might be a bit off here...
Thanks for your comment. I love just a few things on my Quest 2, and several times a week I take ten minute breaks for ping pong, something meditative, tai chi, etc.
You reminded me of the negative aspects of the Meta/Facebook corporate mass, and they should clean up their act in privacy, etc. for VR in the same way they have basically purchased good will in the AI community for releasing LLM model weights.
Apologies for going off topic, but Apple similarly really needs to trade a little profit for buying themselves a better “look” because they are looking a little tarnished also.
100% this. I paid the increasingly common "privacy and control premium" for a Valve Index (which I'm very happy with) to avoid the entanglements of borrowing a headset from Meta for a large, up front, non-refundable fee.
Valve makes great unlocked hardware. I don't see the same argument working with Apple, however. You can't even upgrade an SSD in a recent Mac, not to mention individually cryptographically signed components like cameras and touchpads, which can't be replaced without a visit to an Apple-certified repair person. Renting hardware indeed.
You're comparing a $300 product from a company that profits on analyzing their customers to a $3500 product from a hardware company.
This is not a fair comparison. They're motivated differently.
Furthermore, the "anti-account" viewpoint is making a privacy issue out of a friction point. Accounts are required for both devices. If you bought a device which allowed you to buy apps, the experience would be horrible without an account. If most people are willing and it's a better experience, it makes sense to force everyone onto the same rails to reduce implementation cost. If it increases revenue, there's yet another reason to do it. It's ridiculous to be in an ideological minority and expect a company to bend to it when it's not in their best interest.
While I prefer Apple products because <yada yada>, Meta and Apple are doing the same thing here. The only difference is that Apple has higher current trustworthiness. This is also the reason they can release a $3500 headset.
Why isn't that a fair comparison? If that's an important factor in their purchase decision, that's completely fair. You can compare whatever you want when you're evaluating subjective criteria for a purchase decision; the socioeconomic rationale behind the motivations leading to the companies' decisions is interesting, but not relevant to the comparison at the point of decision.
I’ve had a Quest 1 for years, it always required an Oculus account, and since rebranding as Meta it now requires a Meta account, which my Oculus account was converted into.
It’s not bricked and hasn’t deleted my software, I’m curious what exactly you’re referring to with that.
> Requiring a Meta / Facebook account for already-purchased Oculuses, retroactively bricking devices and deleting software which was bought before that requirement
You always needed an Oculus account, and they didn't brick anything. You did have to migrate from an Oculus to a Meta account, but a Facebook account was never required on a Quest 1 (or 3). Is a Meta account really that different from an Oculus account?
The Quest 1 has been deprecated, yes, but not bricked.
Isn't this website run by people that school people to do just that?
[dead]
This has been my main complaint with Oculus all the way since the Rift days. At that point I assumed it was forthcoming within a few years yet here we are 8 years later and somehow it's not all that different. I don't understand how Oculus/Meta isn't drastically ahead at this point on software.
Actually, MSFT made the same blunder when it came to HoloLens. Well... they did start to build some of the core spatial context (and had a fabulous head start). But somewhere along the way, they yielded to Unity/Unreal. This was mind-boggling to me, as giving away the keys to the platform to another party was literally the founding story of Microsoft (with IBM having made the blunder). I wonder if engineering leadership recalls history when making such strategic goofs.
Layman's take: All of them are afraid of making the system that fulfills the promise nominally, but that lacks some key component or is on hardware that doesn't get adopted, only to have a competitor swoop in, clone that system with the necessary fixes, and essentially do what Apple did with MP3 players and smartphones. They're all trying to establish market dominance BEFORE giving us a reason to use the devices (bass ackwards) - and are even happy to see the market collapse, if it meant that, simply, no one cracked that particular nut.
Apple, Meta, Microsoft like how things are right now. These pushes are much-hyped, but they're made less out of real passion for the promise and more desperation to avoid being left behind.
What’s more insulting, after the announcement of the vaporware known as “Infinite Office”, is Meta’s total lack of attention to their PC software. The work-related features of the Quest would be near non-existent if it weren’t for 3rd parties.
I totally agree. While Apple has a north star with this device (or looks like it does), Meta's endeavors always seemed like diversification. Meta seems to be looking for the north star. Apple just pointed it out, so now everyone is going to head that way.
Well it's easy to understand why. How could they build an MR ecosystem when their latest device is just barely MR?
They can only just now move towards MR with the Quest 3 and really it'll need another generation to be MR native.
They have a good relationship with developers and focused on what their current hardware is capable of, which is running one VR app. They spent the last 8 years on that use case and I think that was the right choice given the hardware realities at the time.
They are drastically ahead. They have VR games, which are the only real reason to own a VR headset at the moment (and for at least 5 years).
> VR games, which are the only real reason to own a VR headset
Because the rest of the experience is so unpolished.
I have a Meta Quest 3 and overall it doesn't exactly feel like they invested tens of billions into that ecosystem. The headset's UI is basically a 2D desktop with taskbar and app launcher covering a small fraction of the field of view, including some buttons that are so small it's tricky to aim at them with the controllers. The Oculus desktop client fails to recognize it via USB and the official remote desktop app is still in Beta while Steam lets me play games or use the desktop remotely with two button presses. To this day I have not managed to just copy files directly onto the device, no USB connection (other than to Steam) works. Only some semi-reliable wifi transfer from a third-party application worked but that required enabling developer mode.
On top of that they decided to ship it with a head strap that never fits well and gets painful within 30 minutes, and then made it unnecessarily complicated to swap. Yes, of course people aren't going to do more on that thing than play a few rounds of Beat Saber, because many simply don't want to jump through hoops like that. I think it's a great device overall but some things are just so...unnecessary.
Apple not focusing on games might be a good thing because it means they can't just rely on games for free sales numbers.
> To this day I have not managed to just copy files directly onto the device, no USB connection (other than to Steam) works. Only some semi-reliable wifi transfer from a third-party application worked but that required enabling developer mode.
To echo this today I wanted to watch something on my vision pro so I was on the tv app on my phone, saw a movie I wanted to watch, and then after a good amount of time moved over to my vision pro.
Being the scatter brain that I am I forgot what the movie was, unlocked my phone and the movie listing view was there. In my head I was like “damn wish I could share this page over to my vision pro like I do for my ipad”
And that’s exactly what I did with Airdrop. The already existing way to share anything between apple devices. I would not be surprised if universal clipboard works as well.
You fail to realize that wearing such a headset 8 hours a day is not the holy grail for most people; I'd never work that way. It's horrible for your eyes and overall health in many ways we already know, and many that will be discovered after this beta testing runs for a decade-plus.
Entertainment, maybe, but definitely not work. Like it or not, outside a few tech bubbles this is how the world sees VR, and it's not changing anytime soon. Still, to sporty, outdoorsy people this is a kids' toy (one that shouldn't ever be on a kid's head); reality is and always will be better and healthier.
There's a real chicken-and-egg recursion there: VR headsets are only good for VR games because the only thing made for VR headsets is VR games because VR headsets are only good for VR games because...
It’s not chicken and egg, it’s simply the reality about the hardware. Even Apple, with all their resources and a $3500 price tag, could only make a mediocre passthrough on a very heavy headset. The hardware isn’t ready for AR yet.
Games are where it’s at for the foreseeable future. Games don’t need passthrough and they don’t need especially high resolution.
Look at how much of the vision pro is about giving people a connection to the real world while they are using the device. Games don’t need that, people want to be immersed while they are playing a game.
> Even Apple, with all their resources and a $3500 price tag, could only make a mediocre passthrough on a very heavy headset. The hardware isn’t ready for AR yet.
Hugo disagrees:
> thanks to a high-fidelity passthrough (“mixed reality”) experience with very low latency, excellent distortion correction (much better than Quest 3), and sufficiently high resolution that allows you to even see your phone/computer screen through the passthrough cameras (i.e. without taking your headset off).
> Even though there are major gaps left to be filled in future versions of the Vision Pro hardware (which I’ll get into later), this level of connection with the real world — or “presence” as VR folks like to call it — is something that no other VR headset has ever come even close to delivering and so far was only remotely possible with AR headsets (ex: HoloLens and Magic Leap) which feature physically transparent displays but have their own significant limitations in many other areas.
I admit I don’t own one of these things but reviewers seem to be unanimous that the passthrough on the Vision Pro is both the best of any headset on the market, yet also very mediocre compared to seeing things through your own eyes, especially in low light.
Given that it’s designed to be used indoors, poor low light performance is a big problem.
There’s a latency/acuity tradeoff whereby the more post-processing Apple applies to improve acuity, the worse the latency and more nausea they create. It’s going to require a lot more research into hardware post-processing.
Seems like the best passthrough was a fairly easy goal to achieve, since nobody else was even really trying. Heck, the Quest applies quality-degrading filters to passthrough video (adding noise, removing chroma) to discourage using it.
Filters to discourage use? Do you have a source for this? Surely they are just low-res, infrared cameras.
They probably mean the Quest 3, which has RGB cams, unlike the prior Quests 1 and 2. I also disagree that it would have been artificially muddled to discourage usage. If that were the case, they'd not have presented it so proudly. It's just the kind of cam setup that $500 buys (in fact, it's probably a bit subsidised).
But VR only needed DK1 to take off.
They are drastically ahead in the VR gaming space. But the potential VR/AR market is hundreds fold bigger than that.
The potential market in 10 years. Apple has jumped the gun here. This is their Apple Newton moment for AR.
That's probably a fair argument at the pace of innovation pre-AVP. Depending on how quickly they iterate in this (and as the article says, push developers), they may be a self-fulfilling prophecy to significantly reduce the time until this market exists.
I hope that seeing where the Newton could’ve gone gives them the confidence to continue with the AVP. A few iterations could really show a great product both in terms of quality and practicality.
I had a Newton and loved it, and eventually tried a few Palm devices but nothing ever quite hit like the Newton for me, a real shame they dropped it imo.
Effectively, though, they've created their own mini innovator's dilemma. They can't do anything to alienate those users, but they might have to if they want to stay competitive in the long run.
Innovator's dilemma is a great problem to have if you are dominating a profitable industry already. But it's a terrible problem to have if you are barely hitting break even or even losing money. Then you really can't afford to go backwards first to go forwards later.
Are we gonna ignore the fact that VR pornography exists?
Also, I'm pretty sure those VR games that are run on a computer connected to a headset could display on any headset, Apple or Oculus. A cursory search reveals people have already been getting SteamVR to work on the Vision Pro.
Running stuff directly on the headsets is neat, but there's no headset on the market with enough power to match what you can have when plugging them into a computer.
> Are we gonna ignore the fact that VR pornography exists?
On the Apple Vision Pro?
The point is that VR games are not "the only real reason".
[flagged]
[dead]
Through Safari, or forthcoming VR video player apps (Apple doesn't censor generic utilities).
I'm not saying they should (in fact I think the Valve approach is better), but Apple does seem to be strongly against porn. Why wouldn't or shouldn't they censor generic utilities? If it's keeping users safe in apps, wouldn't it be keeping users even safer in a browser or other content browsing app?
You’re suggesting Apple would block porn altogether on their platform. You can go to any XXX site right now, find VR videos, and play them in your mobile browser. Throw on a Cardboard or even a crappy Polaroid “VR” phone case and you’re set. There’s no way Apple would say “well, on the Vision Pro, we will actively block adult websites”.
The players are already there. Those who want VR porn have been able to view it on AVP since days after release.
VR porn is largely just WebXR on webpages or SBS VR180 videos. WebXR has been available on the Vision Pro since day 1 if you enabled it in Safari's advanced options, and there are now multiple video apps that can handle SBS VR180 playback.
All the news about it not being possible on the AVP was largely a bunch of hyperbole, misunderstandings, and misinformation.
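For anyone unfamiliar with the "SBS VR180" format mentioned above: each video frame holds both eyes' views side by side, and a player just splits the frame in half and projects each half onto a 180-degree hemisphere per eye. A minimal sketch of that split (function name is mine, not any real player's API):

```python
def sbs_eye_crops(frame_width: int, frame_height: int):
    """Split a side-by-side (SBS) VR180 frame into left/right eye rects.

    Each eye gets half the horizontal pixels; the player projects each
    half onto a 180-degree hemisphere for the matching eye.
    Rects are (x, y, width, height).
    """
    half = frame_width // 2
    left_eye = (0, 0, half, frame_height)
    right_eye = (half, 0, half, frame_height)
    return left_eye, right_eye

# An 8K-wide SBS frame (7680x4320) yields two 3840x4320 eye views:
left, right = sbs_eye_crops(7680, 4320)
```

This is also why any generic video app can handle the format: the hard part is the projection, not the container.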
To me the interesting bit is that even a VR executive a decade-plus into working in the field doesn’t find this device compelling enough to own one.
I get that the thesis is that this version is the devkit, etc., but viable consumer product status (read: enough adoption for the device to be profitable) seems very far away.
> viable consumer product status (read: enough adoption for the device to be profitable) seems very far away
He mentions a few short term use cases for the current hardware.
For example: Productivity on the go (A laptop with the headset for multiple virtual displays) and Live Sports.
> Apple Immersive on Vision Pro is a transformative experience in terms of video quality and its ability to deliver a real sense of presence. Watching a game in high-resolution VR has the potential to be legitimately better than a regular 4K TV broadcast by enabling hardcore fans to feel much closer to the action
I've heard the sport idea thrown around a few times, but I'm not sure I buy it.
If you go to a sports event you are mostly buying the experience of being there: the energy of the crowd, the cheering, all that stuff. The actual experience of seeing what's happening is not really better, is it? That's why the stadiums have screens in them.
Replicating that experience at home is more like getting people around to watch a game together.
> I'm not sure I buy it.
"legitimately better than a regular 4K TV broadcast by enabling hardcore fans to feel much closer to the action" sounds like it offers something new.
People who are sports enthusiasts have a proven willingness to drop thousands of dollars on large screen televisions, streaming services like NFL Red Zone, or thousand dollar Superbowl tickets, so the potential for sales is there.
I think the question is, how many people watch sports to be close to the action, and how many watch sports to be close to their friends?
Are you saying that there is no such thing as people who watch sports at home by themselves?
Because that's not remotely true.
In addition, Apple already has its SharePlay tech that allows networked users to watch shared video, listen to shared audio, or game together.
My argument is that the experience of seeing sport from a specific seat is inferior to watching it multi-camera with huge zoom lenses. The thing that draws you to the stadium is the sense of being in a crowd.
The technology to do this has been around for a while, so has anyone tried? I'd certainly be curious to give it a go. I imagine there are some technical problems too, like if your team scores and you jump in the air and your viewpoint stays still.
This did get me thinking about whether any sport might be better viewed in VR; maybe games like pool, snooker, or chess, where you see the whole thing from one vantage point and the scale is such that the 3D of it all would be meaningful.
The argument in favor of the tech Apple is using in this essay sounds pretty compelling.
> The NextVR acquisition is what led to the incredible Apple Immersive video format, which enables capture of 3D video in 180 degrees in 8K resolution at 90 frames per second, an absolute juggernaut format with 8 times the number of pixels of a regular 4K video. The best way to think of the new Apple Immersive video format is kind of like a new IMAX-3D, but the real magic is the fact that it’s projected inside an imaginary 180-degree sphere (horizontally and vertically) that takes over your entire field of view.
> Vision Pro is the first VR headset that enables playback of 180-degree 3D video at what feels to the eyes like 4K quality.
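The quoted "8 times the pixels of a regular 4K video" figure checks out arithmetically, if you assume "8K" means a 7680x4320 frame and the total counts both stereo eyes (my reading, not stated explicitly in the article):

```python
# Back-of-envelope check of the quoted "8x the pixels of 4K" figure,
# assuming "8K" = 7680x4320 and the total counts both stereo eyes.
pixels_8k = 7680 * 4320          # 33,177,600 pixels per frame
pixels_4k = 3840 * 2160          #  8,294,400 pixels per frame
per_eye_ratio = pixels_8k / pixels_4k   # 4.0x per eye
stereo_ratio = 2 * per_eye_ratio        # 8.0x across both eyes
```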
"IMAX-3D" sounds much more compelling than watching a flat image on a television.
That's definitely impressive tech, and I'm sure the experience inside a headset is pretty incredible. What I'm saying is that it is not a good match for live sport.
When you watch sports they have multiple cameras all over the place. Fixed cameras with long lenses, cameras that zip over the pitch, cameras on blimps, slow mo cameras and so on.
These cameras are so much better for enjoying sport that they put giant screens in the stadium so you can see what happened after a goal is scored.
That experience is never going to work in a VR system (beyond VR as a way to have a big screen available), because if you keep shifting the position and focus you'll make everyone very motion sick.
> When you watch sports they have multiple cameras all over the place.
Why assume the viewer doesn't have a choice of viewpoint locations they can decide to switch between?
I can certainly see a use case for that, and it's not sports (though I guess you could call it that, lol).
But prudish Apple will surely block that from happening. I don't really get why. It's a valid request and one where the technology really shines. I use similar content on the quest 3 and it's great but it would be so much better on something like the vision pro.
Yup, there are hardcore fans but sports are largely a social event.
> For example: Productivity on the go (A laptop with the headset for multiple virtual displays) and Live Sports.
Except almost universally, people talk about the screen display being “not great” for extended use as a screen replacement with dramatically lower effective resolution and blurring…
> people talk about the screen display being “not great”
That's not what the reviews I have read had to say.
> The Vision Pro can produce a virtual external display for any modern Mac... The virtual display feels responsive and works with connected keyboard or mouse peripherals. The text is highly readable.
I don’t have any complaints about how the virtual display itself works—it’s great.
https://arstechnica.com/gadgets/2024/03/i-worked-exclusively...
It's the crappy version 1. Just like iphone and ipad v1. They sucked.
It's very obviously better to wait a little longer for a future version.
Not sure I agree. When I first saw the 1st gen iPhone I was so impressed with it, I went out and got one a few days later. This was before the App Store. Yes, it might “suck” compared to the latest version, but the first iPhone was super compelling by itself at the time and started selling very well.
Yeah, I used my iPhone 1 for 4 years until I moved to the iPhone 4 (a year after it was released, because I couldn't afford it new). It was a great device, only let down by its ridiculously slow data connection.
Yeah, if the first version doesn't take off that's generally not a good sign. The 1st iPhone did extremely well.
> 1st Iphone did extremely well
Citation needed. The 1st gen iPhone sold 6 million units over two years. The Nokia N95 (not a super mainstream device, but in a similarish price category) sold 10M. Other Nokia phones of the time period sold 100+ million devices. BlackBerry, LG, and Sony Ericsson were in the tens of millions per device model.
Let’s not forget:
1. The iPhone didn’t support 3G, which essentially all other phones of a similar price point had
2. Was only available for AT&T customers in the US (then still known as Cingular Wireless)
3. Cost significantly more ($500-600 w/ two year contract) than the average consumer paid for phones (almost always under $150 with contract, but usually “free”) at the time.
4. No App Store
5. No cut and paste
6. No removable battery
7. No physical keyboard (a positive for me, but was a deal breaker for so many back then)
That’s not to say the original iPhone wasn’t amazing in many ways, but let’s also remember the past accurately.
How many countries was the Nokia N95 available in VS. the 2G iPhone? I don't think it launched in Asia or most of Europe.
> How many countries was the Nokia N95 available in VS. the 2G iPhone?
Way more, especially since the original iPhone was only available in the US for the first 5 months. It was available across Europe, North America, South America, China, and Australia at minimum.
> I don't think it launched in Asia or most of Europe.
The N95 sold heavily across Europe; that was the primary market for it, in fact.
So you can see how it might not be a fair comparison?
Sure do. But it was also quite a bit more expensive a phone, which helps level the playing field some. Either way, there will never be a perfect apples-to-apples comparison.
That said, there is sufficient evidence to support my claim made in my original post.
Is the claim that sales numbers are the only way to measure success? And since the 2g iphone didn't measure up in that department it doesn't qualify as a success?
If you have a counter claim, especially one you can back up with as many facts as I did, please do so. Otherwise, please either stop straw-manning or find some other place to do so.
The iPod, iPad and Apple Watch are all products from Apple where the first version didn't take off. I'd say they did just fine and the iPhone is largely an outlier in Apple's history of new products. Even the initial iMac suffered relative to its later revisions.
> It's the crappy version 1. Just like iphone and ipad v1. They sucked.
iPhone 1.0 was incredible. There was nothing like it. iPad 1.0 (and following) has been lackluster. AVP is impressive, but lacking.
The iPhone changed the world of tech in an instant. There were aspects lacking (slow internet, no copy paste, no third-party apps), but saying it sucked is rewriting history. The things you take for granted about phones came from that.
iPhone 1 was way more successful than Vision Pro, and it didn't suck relative to what was on the market at the time. At launch, Steve Jobs famously said it was 5 years ahead of the competition, and contemporary commentators generally agreed.
> In its first week, Apple had sold 270,000 iPhones domestically. Apple sold the one millionth iPhone 74 days after the release. Apple reported in January 2008 that four million were sold.
Also, the iPhone cost $500 at launch. At the time, that was expensive for a smartphone, but even adjusting for inflation it was nowhere near Vision Pro-level expensive. (It would also be a relatively cheap phone in today's market.)
If anything, the Vision Pro feels to me more like the original Mac: an impressive technological leap forward, with lots of interesting ideas about computing and UI paradigms, but also prohibitively expensive, and still underpowered relative to its lofty ambitions.
Notably, the Mac didn't really end well for Apple. Eventually we got the iMac and OS X, but in between was a decade in which Apple nearly went bankrupt. And I'm not really convinced the Vision Pro is as innovative or compelling as the original Mac was to begin with.
>and it didn't suck relative to what was on the market at the time.
That's going to depend on what things you cared about. The original iPhone was heavily criticized for no copy/paste, no 3G service, no MMS, no physical keyboard, its (absurd at the time) $700+ price tag, carrier exclusivity, lack of a subsidized pricing model, and a number of other things. Plenty of commentators thought Apple had widely missed the mark and had just launched a multi-million-dollar folly that was sure to sink them any day now.
Phones have more mass appeal, which I think contributes to the larger initial numbers. It doesn't change that the original iPhone was not great in a lot of ways. I had one: 2.5G was slow, the screen was small, and it was missing basic features. But it catalyzed what the future was going to look like.
Interesting. MacRumors reported 200,000 Vision Pros sold a few months ago. So maybe 300,000 today?
It's a type of gadget that hasn't become widely adopted yet, and the use cases and killer features are almost nonexistent compared to the iPhone's phone calls, web browsing, music, videos, notes, and many others.
Really hard to gauge what success means here, but if we say that in a year it will sell 500,000 units, that's 1/8 of the original iPhone. Seems OK, or maybe not?
There is no killer feature, so it won't see mass adoption. 99.9% of the population can't afford to drop $3,500 on a computer screen for their computer.
99.9% of the population can’t afford to drop $100k on a sports car. They still exist.
The first Apple Mac was $7500 in today’s dollars.
Armchairs are way over-indexing on price.
Idk, people spend absurd amounts of money on various hobbies and other pursuits that I bet a much larger % of the population can afford a Vision Pro than you might think. We don't really question when someone buys an ATV or boat that they use only a few times a year and easily costs as much as a Vision Pro.
Not comparable. I paid $7k for an upright piano (which is a rookie number not worth bragging about) which is my biggest purchase other than a car so far, plus ongoing $90 weekly lessons, but I won't ever regret because it is a very meaningful and valuable investment -- the piano easily lasts a decade, I practice every day and am happy about what it brings. People who blow $100k on a Steinway think the same. Vision Pro? Not a chance, even as a one-time purchase. Maybe after I have a big house and earn $1m in annual income and have too much money to waste.
I know people who make not much more than median income and easily blow thousands a year on hunting trips. Think of how many people spend a ton of money on a truck that they only use for normal commuting. People easily spend thousands a year on hobbies, and the AVP easily fits into that.
> boat
Brunswick Corporation has a market cap of about 6 billion.
But that just makes it a bigger success, right? Adjusting for the crazy price, it's even more impressive if it sells almost 1/8 as much in the first year.
My impression is that it's going to fall in price in the next iterations, though I agree that right now it's not even targeted at the masses.
> computer screen for their computer
You're making a plenty convincing argument- why misclassify the device? It's a full computer.
Yeah, not holding my breath for a Vision Pro II to see the light of day.
> very obviously
And yet people buy v1. It really depends on how much your time is worth. I bought v1 and I expect to sell it for $1000 or so when v2 comes out. $3000 to use this product for 12-18 months is totally worth it to me.
So, not “obvious”. At least to people with different priorities.
I think there are more than enough higher income people who would pay 5k just for a thing to watch a movie in private, with much better immersion than any alternative, on a plane.
> watch a movie in private, with much better immersion than any alternative, on a plane.
As someone who’s worn mine to watch movies on multiple flights, the problem is two fold.
1. The device is ridiculously hard to get into “travel mode” on a plane, especially if the device was powered off previously and you have to enter a passcode. Each time the “tracking was lost” notification is shown, it forces you to start your passcode over from the beginning. Those who believe in better security than a four-digit passcode are brutalized. Then just getting Control Center open and selecting travel mode (needing like five pinch operations) can be insult to injury. I can't imagine going through that in economy in tightly packed seats. After going through that experience twice, I now ensure it's ready to go on the ground before boarding, but that's also a hassle.
2. Wearing the device for the length of a movie is still a struggle. I have a ton of time on other VR headsets (which I also can’t wear comfortably for 2hrs), so this isn’t just a “getting used to it” thing. Unlike the previous problem, this one isn’t really solvable without different hardware.
That said, once the movie starts, it’s the best movie experience on a plane ever for the first 20-30min.
I’m one of the rare people who doesn’t have an issue with wearing vr headsets for great lengths, but I suspect that’s because I strengthen my neck for jiujitsu and that bleeds over into endurance with headsets.
Too much to ask for the average user, but the problem can be mitigated by the individual.
It’s not my neck that is the issue, it’s as much or more the pressure against my face.
I recently gave it a try and it immediately prompted me to turn on travel mode after putting in the passcode (though, yes, that part was difficult).
Weird, definitely didn’t for me. Wonder what’s different between us?
That's a bummer, wonder if a third party strap with a different weight balance could make it comfortable enough.
Software sells systems is the motto.
Apple is enamored with vertical integration which gives them control on a whole other level compared to their competitors; feels like history repeating.
What's different with AVP compared to previous products is that it starts off even better thanks to Apple's own custom chips. There's also the amazing network effects of their ever-growing ecosystem.
Competitors don't have all this, so they will struggle to compete on the high-end. The intention of Apple is clearly indicated by the price of AVP, they want the profits at the top, let the rest fight over the scraps at the bottom with crummy privacy-invasive software and poor integration/interoperability.
It also 'starts off better' because they refined its components throughout the rest of their ecosystem over the last half decade (or more?). If you look at a variety of unprovoked UI changes in iOS and tvOS, or hardware changes in iDevices, they now look like field tests at scale for learning, before bringing together these new, now proven, things.
It's a way of development seen almost nowhere else.
Or I'm giving them too much credit ... but I don't think so. I think it's evident they seeded hard parts throughout the rest to learn at massive scale.
I like that Apple is focusing on 3d widgets as an app primitive but is it really that hard to put that into Oculus/Android? Android actually does have widgets. What about the OS precludes it from what Apple has done?
There's some hard decisions around forcing everyone into their custom material that Apple made so that they can handle the rendering more deeply....but is that really a core OS thing? Seems like it doesn't need a new kernel for that.
It’s not so much difficulty as system architecture. Oculus just doesn’t have an OS layer, at least not in the sense of a platform that helps applications share resources and interact with each other.
The Oculus platform is more like a classic video game console; there are system APIs, but they are designed to be used by single-tasking applications.
And for the user, the Oculus system UI is really an app launcher /task switcher.
It’s not better or worse, just a very different design philosophy.
Huh? I'm very confused as to what you mean. It's a customized version of Android. All the Android multi-tasking and app pause and resume life-cycle stuff should still be in there. Most of their ecosystem is heavy duty games that use all the device's resources (kicking out other apps from the working set), but it's definitely a multi-tasking OS.
I really do not think 3D widgets would require an entirely new OS. The main app switcher would need a revamp and they would need to re-purpose or build out some new app life-cycle callbacks to handle widget focus and interaction but it all seems very doable and not much harder than what they've already done.
To me it sounds like what's being described is just a 3D desktop. But in the walled-garden world we live in, most people just call the OS/desktop combo the "OS" when it comes to Android/iOS.
Damn. I never thought of it that way. You need that OS layer. That should be Meta's core competency if they want to win. Games are something that runs on top of other people's platforms. I thought Zuckerberg did all this to stop being a layer on top of somebody else's stack, but all they did was the exact same thing with Oculus.
That is what always bugged me about the pivot to “meta”. They never had to find product market fit to succeed. They were never hungry. They could just throw money until something clicked… but money alone doesn’t make a revolutionary product. You need somebody hungry enough to see the world in a different way and then execute the fuck out of it.
Dunno how this relates to apple though. They have equal amounts of cash to throw at problems until they are “solved”. Perhaps the “operating system” is a solved problem already to some extent and maybe there isn’t anything truly new?
I get what you are saying, but that is why the Vision Pro is still an over engineered dev kit for a half baked OS.
At least the Meta Quest, for example, has a lot of content and VR games. The Vision Pro doesn't seem to have much use apart from its curiosity value, because such a system hasn't been fully built out. It seems like a device that isn't really ready for prime time for a couple of years yet.
Thinking about it: if Apple had launched the Vision Pro with something like Half-Life: Alyx playable on it, the way they did with Death Stranding on the M2/M3 chips, they might have had a larger buyer pool.
They would have needed controllers and actually cared enough to support steam on it.
There is zero chance Valve would release HL: Alyx without full Steam support on the device.
That being said, I get what you're saying - that a killer game could have helped the value proposition. They clearly didn't design it for that though, even based on how much lower their refresh rate is for hand tracking.
It feels like a consumption device like the iPad, with some productivity mixed in.
The irony is I can use Steam Link with an iPad. On top of that, I can use it as a second monitor when I'm on the go, among some other productivity uses. From what I saw of the Vision Pro, nothing compelled me in that department. And I have to agree with you: not having some sort of controller interface was an additional no-go as well.
This was my immediate takeaway from using the Quest 3 as well. Zuck has stated forever that he wants a platform, yet when it comes time to do the hard work, we just end up with an Android distro running a React app.
Could be partly Qualcomm too. I wonder if they're willing to release source code for their drivers if you want to get really low-level.
And how many billions did they spend on it?
I was saying the same thing back during the first five years of the iPhone. There were so many ostensibly serious people who thought that BlackBerry or Nokia would have an “iPhone killer” just around the corner, and it’s like… do you chumps have any idea how difficult it is to build an operating system?
Well BlackBerry eventually did acquire QNX. But this was in 2010 which was far too late…
As a developer, I am actually very happy that Meta went with Android. Reusing all the knowledge and tools is just great...
He talks about how "Gaze & pinch is an incredible UI superpower and major industry ah-ha moment" but... if that's really the case, then it's quite an indictment of the VR industry:
> The hardware needed to track eyes and hands in VR has been around for over a decade, and it’s Apple unique ability to bring everything together in a magical way that makes this UI superpower the most important achievement of the entire Vision Pro product, without a shadow of doubt.
So they had all the pieces, but only Apple put it together and realized that you'd need a VR equivalent of point-and-click? If that's actually true, it's sad.
It's almost exactly the same kind of conceptual transition that Apple made happen with keyboardless smartphones, too, which adds an extra sort of funny element to it.
Putting it together is not as simple as it seems. I think it was an immense engineering and design effort from Apple to get it to the point where it feels effortless and obvious
Not only do they have two cameras per eye, and all the hardware for wide angle out-of-view hand tracking, they had to consider:
Privacy: the user’s gaze is never delivered to your process when your native UI reacts to their gaze. Building this infrastructure to be performant, bug free and secure is a lot of work. Not to mention making it completely transparent for developers to use
Design: they reconsidered every single iOS control in the context of gaze and pinch, and invented whole new UI paradigms that work really well with the existing SDK. You can insert 3D models into a SwiftUI scroll view, and scroll them, and it just works (they even fade at the cut off point)
Accessibility: there is a great deal of thought put into alternative navigation methods for users who cannot maintain consistent gaze
In addition to this they clearly thought about how to maintain “gazeable” targets in the UI. When you drag a window closer or farther it scales up and down maintaining exactly the same visual size, trying to ensure nothing gets too small or large to gaze at effectively
There are so many thousands of design and engineering decisions that went into making gaze and pinch based navigation work so simply, so I can understand how it hasn’t been done this effectively until now
> So they had all the pieces, but only Apple put it together
Its very difficult to change a mindset or culture in big companies. Existing VR companies were too invested in using a controller. Similarly back in the early smartphone days all the big companies thought that smartphones must have physical keyboard.
Sometimes you really have to decide to take the risk and ship without a standard controller before you can see if the new model will work. The iPhone was famously derided for a lack of stylus. Video game consoles for years have tried to incorporate motion controls in some form or another and realistically only the Wii succeeded in any measure because they ditched the classic controller instead of trying to shoehorn it in. Many times it doesn't work, but if you give developers and users an "easy escape hatch" to go back to what they're already comfortable with, so many of them will default to that no matter how much better your new option might be.
On top of what the others said, there are two other closely related issues:
If everything is designed for your controller, the eye interface may not work well due to lack of software optimization.
Which means it’s just an expensive battery hogging extra weight you don’t need.
Maybe they realized they needed it, but Apple actually pulled it off.
Apple has a stronger combination of hardware design, software implementation skills, and UX expertise, than any company in the world.
If what you said were true then this is a fatal strategic error on Meta's side.
This entire time, they could have built a real OS, solidifying their first mover advantage.
Meta makes social media apps, whereas writing operating systems is Apple's core competency. Both companies are playing to their strengths.
The problem is that by doing that they’ve limited their device’s usefulness severely.
If some kind of killer AR app shows up on the Vision Pro, could it be put on the Quest? Let’s just assume it doesn’t need a level of processing power that the Quest can’t deliver. Would the software vendor just have to implement the entire interface from scratch or with Unity or something? Are there enough platform components on the Quest to be able to do the job?
I don’t know the answer. But I did see a number of developers mentioning online over the last year just how incredibly easy it was to get started with the Vision Pro compared to the quest. If you have a Mac you sign up for the Developer program for $99 and you get an IDE, compiler, simulator, performance monitoring, full UI library plus documentation. It’s early days for some of that stuff, but all the batteries are included. From what they said it was far far easier to get to “hello world” than on Meta’s platform.
The funny thing is that the social media app for Oculus (Horizon Worlds) is total dogshit. The third party VRChat is far more successful.
Apple leveraged its existing OS X stack; for Meta this would mean either heavily forking Android OR starting their own OS. Both would take 5+ years to get meaningful traction. Remember Google Fuchsia: the code repo was public in 2016, the initial release was in 2021, and it's still not anywhere near where it'd need to be for a VR headset.
I think that kind of undersells it. Yes, Apple had all sorts of existing technology they could leverage. But they still built a completely new spatial UI paradigm for it.
And an entirely new interaction model that hasn't been seen before. Using gaze to replace a mouse isn't new, but combining that with a pinch gesture to “click”, plus some of the other things they've come up with, is a unique combination that seems to work quite well. Though there is certainly room for improvement.
Didn't oculus have pinch to click before Vision Pro came out?
I own one of each, and develop for the Vision Pro through my job, it's the very same story it's always been. Apple hasn't 'invented' much here, but the magic is in how it's assembled, even in its current state, using apps in a 3d space feels better than anything the quest has ever done. Even simple things like 'touching' a panel just feels more natural on the vision pro than the same experience on the quest, mostly because the quest does things like forcing the ghost hand to stop at the surface of the window, instead of continuing to track your hand through it and just using the intersection as the touch point. It's a small difference in the interaction that makes a world of difference in usability, which Apple is very good at.
Every game console comes with its own operating system, even though Sony and Nintendo are not in the OS business.
Just take FreeBSD and add your own UI on top.
To be fair, you probably don't have to build your own kernel like fuchsia. You can almost certainly start with a bare bones freebsd or Linux kernel or what ever. You're still making a custom gfx layer and lots of user land but you're get a lot for free too.
VROS is already an Android fork.
VR needs performance. Android just cannot and will never deliver it.
They started to, and then they gave up on it.
https://www.engadget.com/meta-dissolves-ar-vr-os-team-204708...
That operating system (a hard fork of Google Fuchsia) was really designed for low-power AR wearables and made almost no considerations for supporting VR (or, like, any existing software, which was one of the major drawbacks). Too many systems designed from scratch with no compatibility with traditional OS APIs. I don't think it would have been viable even with 5+ more years.
> Apple has made a fully realized spatial operating system
I'm not sure what you mean precisely. Apple doesn't seem to have done more than windows with persistent positions. This isn't nothing, but it's also not something that has tremendous value for a headset that you only wear 30 or 45 minutes at a time.
And they have little to no management of these floating windows. I'm really not holding my breath for Apple to come up with breakthrough window management, given what they've done for the past decade.
If you don't think in terms of potential and promises, but of actual value to the user right now, I'd understand why Meta hasn't bothered with the gimmick.
Is this a big lead for Apple? Perhaps; the world mapping could be something difficult to reproduce. Or Meta could be at roughly the same point but decided not to go there.
It's kind of fascinating, because as you say, many of the capabilities are barely surfaced in the user layer yet. But the fundamentals are there.
Take a look at this Reddit post for example:
https://www.reddit.com/r/VisionPro/comments/1ba5hbd/the_most...
The user is pointing out that the real fridge behind them is reflected by the surface of the virtual object in front of them. And consider, on top of that, that the fridge is not visible to the headset at that moment; it is captured in the 3D spatial model that was created of the room. None of this is a pre-rendered or rigged or specifically engineered scenario. It's just what the operating system does by default. So one app that is totally unknown to another app can introduce reflections into the objects it displays. This is just so far beyond what can happen on the Quest platform by any means at all. And it can only happen because the 3D spatial modeling is integrated deeply into the native rendering stack, not just layered on the surface of each app.
I'm with you on how incredible it is from a technical perspective.
The most fascinating thing to me is how Apple has gone from being the pragmatic, real-world-product-focused company, moving slower but making sure what they ship has undeniable practical value and not promising much beyond that ("we don't talk about future products and roadmaps"), to the "wow, look at that technical prowess, not much use right now, but such potential!" pitch we're getting with this device.
I don't see it completely falling flat, but it feels like it's on the same course as the Apple Watch or the HomePod: to be the biggest in the niche category it defines (whatever that ends up being), and a smaller presence in the general space ("smart eyewear?"), with better-fitted and more practical devices taking 70% of the market.
The clunkier XReal will probably keep chugging along, being to the AVP what the Xiaomi or Huawei smart bands are to the Apple Watch. And Meta will probably be the Samsung, shooting at the target from 5 different angles.
completely agree!
I keep seeing people trying to shove this into the narrative of "Apple coming late but doing it better and solving real problems". But it doesn't fit that narrative well. Apple here is early to something else that just happens to look like the thing people are viewing as the predecessor, and its utility is highly questionable and full of all kinds of weird gaps. In a telling kind of way, they are actually in part leaning on the aspects they are not trying to sell (Here, look at these immersive experiences! But shhh, don't call it VR) to paper over the fact that the core of what they are really trying to build is just not ready yet.
It's just a skybox with a cubemap of the room, no? The Quest 3 does a full room scan, and you can get a 3D mesh of the room as well. They could provide a cubemap too.
Meta doesn't want to pass any of the camera feed to an app for privacy reasons, so they don't make it available, but it's a legal issue, not a technical one. They can (and should) do this for the browser model viewer.
Apple enforces a single material model so they can inject the lighting data in a uniform way. It's a bit of a nuclear option to have a fixed shader pipeline. But I digress...
Anyway, it's not as out of reach as you claim.
>So one app that is totally unknown to another app can introduce reflections into the objects it displays
Reflections of the real environment on virtual objects do not imply a global scene graph, or the ability of apps to reflect one another. It just means that each app gets fed the environment map that mostly had to be generated anyway because of SLAM. Neat but not some kind of radical form of IPC.
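To make the point concrete, here is a minimal sketch (plain Python, purely illustrative; these function names are mine and not any real visionOS or Quest API) of what an environment-map reflection amounts to: reflect the view direction about the surface normal, then use the result to look up the shared environment map. No cross-app scene graph is involved.

```python
# Sketch: environment reflections need only a reflection vector plus an
# environment (cube) map, not knowledge of other apps' geometry.

def reflect(d, n):
    """Reflect direction vector d about unit normal n: r = d - 2*(d . n)*n."""
    dot = sum(di * ni for di, ni in zip(d, n))
    return tuple(di - 2.0 * dot * ni for di, ni in zip(d, n))

def dominant_face(r):
    """Toy cube-map lookup: which face would this reflection ray sample?"""
    axis = max(range(3), key=lambda i: abs(r[i]))
    sign = "+" if r[axis] >= 0 else "-"
    return sign + "xyz"[axis]

# A view ray heading down at a floor (normal pointing up) bounces back up,
# so it samples the environment map's upward-facing region:
r = reflect((0.0, -1.0, 0.0), (0.0, 1.0, 0.0))
print(r)                 # (0.0, 1.0, 0.0)
print(dominant_face(r))  # +y
```

The room scan only has to feed one shared environment map for every app's materials to pick up reflections like the fridge example above.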
Other people will come up with the right way to do spatial windowing… but they'll fuck up somehow and Apple will take it, refine it, polish it, and “win”.
I have negative confidence in Apple's ability to design a windowing system. Historically they have never once shipped a good one, and only occasionally have they shipped an acceptable one.
That's Apple from 2 decades ago.
Current-day Apple and its management launch products with weird twists ("send your pulse to your loved ones", "this method of input will stay in history alongside the mouse"), then cut them down to only the features people really care about, shut them out from competing OSes, still watch the competition execute better on that limited set of features, and get sued and forced to remove their most advanced feature after months of media humiliation.
That's why people get back to the first iPhone 17 years ago when they want to predict a bright future for the AVP. I still want to see it push the field forward, but odds are not in its favor IMHO.
And as an educator who taught a VR-development course at university a few times since 2018: setting up and maintaining 15 Oculus Quest Mk I headsets was an absolute pain, with accounts that I had to set up, etc. Sure, it worked somewhat like an Android phone, but there was no real fast pass for users like me; a lot of the features and the UI changed over time, and it ultimately felt like the platform took itself too seriously and therefore had no problem wasting my time.
When designing a concept the core difference is always whether your design respects the user or whether it does not and tries to make them do things, spend more time on the platform, spend more money on the platform, etc.
I think that’s probably the reason why they are not talking about it and are focusing on specs instead: they know.
To support copy&paste all you need is a common format for 3D objects, like HTML is for 2D documents; glTF might be a reasonable candidate.
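As a rough sketch of what a format-level copy could look like, assuming glTF were the interchange format: the field names below follow the glTF 2.0 spec, but the clipboard framing and the "lamp" object are purely illustrative, not any real pasteboard API.

```python
import json

# Hypothetical clipboard payload: a minimal valid glTF 2.0 document
# describing a small node hierarchy. In a full file the nodes would
# reference meshes, materials, etc.
lamp = {
    "asset": {"version": "2.0"},
    "scene": 0,
    "scenes": [{"nodes": [0]}],
    "nodes": [
        {"name": "lamp", "children": [1], "translation": [0.0, 1.0, 0.0]},
        {"name": "lamp_shade"},
    ],
}

def to_clipboard(gltf_doc):
    # Source app: serialize the document to text for the pasteboard.
    return json.dumps(gltf_doc)

def from_clipboard(payload):
    # Target app: parse and sanity-check the version before importing.
    doc = json.loads(payload)
    assert doc["asset"]["version"] == "2.0"
    return doc

pasted = from_clipboard(to_clipboard(lamp))
print(pasted["nodes"][0]["name"])  # -> lamp
```

Of course, as the replies below argue, agreeing on a container format is the easy part; agreeing on what the geometry inside it means is where interchange has historically broken down.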
The problem with the Apple approach is that there are no apps and games, and there probably won't be many, given it's a $3,500 device with few users that Apple exerts its tyrannical grip over (or if there are, they will be ports of Unity/Unreal, PC, or Android VR apps, not using any of the special features that the Apple OS may have).
People have been trying to solve this problem since the VRML days in the '90s. I suspect you are underestimating the complexity of 3D data by a pretty huge extent.
3D is much, much, much more complicated than 2D, especially if you're trying to interchange between arbitrary applications that may have divergent needs.
Start with this little thought experiment.
What do you mean by 3d object?
Do you mean a set of polygons, like in a traditional triangle mesh? A volume, like voxels? A set of 3d surfaces?
Do you need to model interior details, or only the exterior envelope? Do you need to be able to split or explode it at some arbitrary level of detail? Do we need to encode sharp edges or creases in some way?
etc, etc, etc
This is before you have touched materials, texturing, lighting, any of that.
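To make the thought experiment concrete, here are three legitimate but mutually incompatible encodings of the same unit cube (a sketch; the names and structures are illustrative, and converting between any two is lossy or expensive):

```python
# 1. Triangle mesh: 8 shared vertices, faces as vertex-index triples.
mesh_vertices = [(x, y, z) for x in (0, 1) for y in (0, 1) for z in (0, 1)]
mesh_triangles_per_face = 2          # each quad face splits into 2 triangles
mesh_triangle_count = 6 * mesh_triangles_per_face

# 2. Voxel grid: a dense boolean occupancy field at some chosen resolution.
#    Interior is modeled; surface detail is limited by the grid.
N = 4
voxels = [[[True] * N for _ in range(N)] for _ in range(N)]

# 3. Implicit surface: a signed distance function. The cube is the set of
#    points where sdf(p) <= 0; no vertices or cells exist at all.
def sdf_cube(x, y, z):
    return max(abs(x - 0.5), abs(y - 0.5), abs(z - 0.5)) - 0.5

print(len(mesh_vertices), mesh_triangle_count)   # 8 12
print(sdf_cube(0.5, 0.5, 0.5) < 0)               # True (center is inside)
```

Each representation answers the questions above (interior detail, sharp edges, splitting) differently, which is exactly why "just copy the object" is underspecified.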
The basic math is also dramatically worse, and much more challenging for entry-level participants to grasp. The basic idea of matrix math grows dramatically more complex just in terms of the sheer number of operations. You might be able to get away with your entire app only needing a couple of mults or adds for most interaction in 2D. Basic 3D is closer to 200 mults (usually 3 [4x4] matrix products at 64 scalar mults per [4x4] pair) for the default.
In 2D apps and games, you can often get away with incredibly simplistic calculations. Rarely much more than a translation. Many times it's 1D, with naively obvious solutions. With 3D, you rapidly need to move to 4x4 homogeneous matrices that can handle arbitrary scale, translate, rotate, perspective, and clip volumes.
Nearly every 3D app anywhere has to handle the model, to world, to camera, to clip space path, which involves a lot of complex math well beyond most 2D apps. Usually, 3 different matrix mults with 4x4 matrices.
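A minimal sketch of that model-to-world-to-camera-to-clip path in plain Python (column-vector convention; all the concrete matrix values are made up for illustration):

```python
import math

def mat_mul(a, b):
    # Naive 4x4 product: 4*4*4 = 64 scalar multiplications.
    return [[sum(a[i][k] * b[k][j] for k in range(4)) for j in range(4)]
            for i in range(4)]

def transform(m, p):
    # Apply a 4x4 matrix to a homogeneous point (x, y, z, w).
    return tuple(sum(m[i][j] * p[j] for j in range(4)) for i in range(4))

def translation(tx, ty, tz):
    return [[1, 0, 0, tx],
            [0, 1, 0, ty],
            [0, 0, 1, tz],
            [0, 0, 0, 1]]

def perspective(fov_y, aspect, near, far):
    # OpenGL-style perspective projection matrix.
    f = 1.0 / math.tan(fov_y / 2)
    return [[f / aspect, 0, 0, 0],
            [0, f, 0, 0],
            [0, 0, (far + near) / (near - far), 2 * far * near / (near - far)],
            [0, 0, -1, 0]]

model = translation(0, 0, -5)                             # model  -> world
view = translation(0, -1, 0)                              # world  -> camera
proj = perspective(math.radians(60), 16 / 9, 0.1, 100.0)  # camera -> clip

# Compose the three stages once, then apply per vertex.
mvp = mat_mul(proj, mat_mul(view, model))
clip = transform(mvp, (0.0, 0.0, 0.0, 1.0))
ndc = tuple(c / clip[3] for c in clip[:3])  # perspective divide
print(ndc)
```

Even this toy version already involves homogeneous coordinates and a perspective divide; there is no 2D analogue of that baseline complexity.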
It's one of the main reasons voxels have been the only real 3D representation with large-scale use. The amount of work necessary to develop really anything that works with 3D is a large step up in difficulty unless it's totally regular and square; otherwise, huge numbers of optimizations are no longer available. Plus, the learning cliff for normal users doing effectively arbitrary 3D shape design and movement is really steep. Most first-time users of an industry 3D CAD package (ProE, Solidworks, AutoCAD, Maya, 3DSMax, Blender, etc.) or similar have the "wall of difficulty" moment.
Stupid question… isn’t all that wrapped in a container?
Like in HTML with divs? So if you have a virtual lamp you want to copy, all the various elements that make up the lamp are in a VR-equivalent object markup, like <div id="lamp" />. If you want to copy specific elements you can, but you’d have specific actions, e.g. copy color etc.
Maybe I’m missing something though and it’s more complex.
Don’t think about tag soup. Think about how you’d encode the geometry, and what that even means for anything but the most trivial object, like a cube.
Ok now, I have a few separate colored lights pointing at the lamp. What color is the lamp? How about when it’s on?
Interesting. I assume you’d copy the object, not the lighting. If you move a real object into a different room or outside, you don’t expect the colour to stay the same. But if I’m pasting it into a Word doc, I guess intuitively I might want it to look the same as the source, and that breaks down unless I copy the whole virtual universe. A mirrored globe inside a room of mirrors is not going to look the same unless I copy the scene. People will learn this and maybe demand, e.g., a choice of the copy scope. You might not even have rights to copy all objects. I can’t just copy your palace and paste it into mine unless you allow it (say).
That’s the entire crux. What exactly the hell is “the object”, and how does your target program DO anything to it, to include manipulating it, or getting it out to a GPU for rendering in some even vaguely terrible way - and you’re probably trying to render it 120 times (or more) a second with a decent level of resolution.
3D is not like tabular data. There isn’t some default resting state it naturally wants to exist in. It’s all edge cases and special logic. Also, it’s mind-bogglingly vast amounts of raw data. Even a simple scene can contain hundreds of thousands of surfaces/polygons/spline patches/voxels/whatever representation.
A VR environment is essentially running a high end 3d game engine at all times.
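The "what color is the lamp?" question above can be made concrete with a toy Lambertian shading sketch: the color you see is the object's intrinsic albedo modulated by whatever lights happen to shine on it, so "copy the color" is ambiguous until you pick a scope (all values here are illustrative):

```python
def shade(albedo, lights):
    # lights: list of (r, g, b) intensities, assumed already weighted
    # by incidence angle. Returns the lit color, clamped per channel.
    r = sum(l[0] for l in lights)
    g = sum(l[1] for l in lights)
    b = sum(l[2] for l in lights)
    return (min(albedo[0] * r, 1.0),
            min(albedo[1] * g, 1.0),
            min(albedo[2] * b, 1.0))

lamp_albedo = (1.0, 1.0, 1.0)        # intrinsically a white lamp shade
red_room = [(0.9, 0.1, 0.1)]         # lit by a red light
blue_room = [(0.1, 0.1, 0.9)]        # lit by a blue light

print(shade(lamp_albedo, red_room))  # looks red:  (0.9, 0.1, 0.1)
print(shade(lamp_albedo, blue_room)) # looks blue: (0.1, 0.1, 0.9)
```

Copying the albedo preserves the object; copying the lit result bakes in a snapshot of someone else's scene. An OS-level clipboard for 3D would have to choose, or expose the choice.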
Isn't this already a solved problem? See 3D apps, i.e. Blender, Unity, Unreal etc. You can copy objects easily.
I don't think the intention is to copy an object with the baked in lighting at the time of making a copy. I can imagine you just want to copy the object and the behavior how it reacts to whatever environment it's placed in.
Not between them.
Any scene description format would work (but please not obj or stl), it would be good to see a standard emerge for this, though. USD might be it.
I really hope USD is not it. Having worked with it trying to build stuff for the vision pro, it's a massive software library pretending to be a file format. It is inexorably linked to the source code that processes it to the point that making a processor for the format is a non-starter, and that seems to be by design.
The codebase is dense, hard to compile, using outdated dependencies, and doesn't play nice with anything else. Documentation is sparse, often incorrect, and severely lacking in anything like a new user guide. Everything assumes you work for Pixar already and know the ins and outs of their pipeline.
^ exactly this. The problem with the Vision Pro is that there is nothing to do, whereas the Quest 3 is driven by cool experiences
Tyrannical grip on users who paid $3,500? That indicates no understanding of said users.
“This all inclusive resort has a tyrannical grip on all the poor visitors who spent $10k to travel and stay at an all-inclusive resort!”
It goes back to what I have mentioned elsewhere in this thread that Apple always thinks product first. Hardware specs are only ever in service of the product Apple is trying to deliver. If they could never list specs, they wouldn't. The industry forces some capitulation which is why Apple ever talks about specs at all.
Oculus is a gaming device and doesn't have "productivity" ambitions. I believe that is because 3D glasses have very limited productivity applications. But who knows; for people who think a 13" laptop is okay for work, Vision Pro may become something better for a comparable price.
> I believe that is because 3D glasses have very limited productivity applications.
AR has tremendous productivity applications if the device is small and wearable enough. Imagine being up in your attic running cables and seeing a projection of the floor plan of your house so you can see where the different rooms in your house are. Or driving a car, except all the blind spots disappear and are filled in with vehicle-mounted camera feeds, with unobtrusive overlays for navigation or to highlight potential safety hazards. Imagine assembling some IKEA furniture except instead of puzzling through the instruction book, you have an app that can recognize all the pieces using machine vision and simply show you what to do. Imagine never forgetting a name or a face, because every time you see even a distant acquaintance, your glasses can run facial recognition and make their name pop up by their face in real life. Imagine noticing a weird rash on your arm, but as soon as you look at it, your glasses immediately diagnose it as a potential MRSA infection and pop up a notification allowing you to call an urgent care clinic that’s open right this second.
>Imagine noticing a weird rash on your arm, but as soon as you look at it, your glasses immediately diagnose it as a potential MRSA infection and pop up a notification allowing you to call an urgent care clinic that’s open right this second.
This isn't really an application of 3D glasses; it's a "diagnose my rash" smartphone app. The 3D adds nothing. Your other examples similarly lean more heavily on the "always-on camera with problematic network connection" aspect than on 3D.
The 3D is going to make the user experience a lot more seamless though.
> Imagine never forgetting a name or a face, because every time you see even a distant acquaintance, your glasses can run facial recognition and make their name pop up by their face in real life.
This could work if they weren't wearing a VR helmet themselves.
I keep saying “glasses” because eventually the technology is going to get miniaturized to that extent, and you can facial recognize people wearing glasses. But you could also have a handshake protocol for the devices themselves.
Oculus has expanded its applications beyond just gaming. There are also productivity applications and tools available for Oculus devices like virtual desktops
I perceive it as an experimental sector. They want to keep a finger in the VR productivity pie; moreover, they want to keep their fingers in all VR pies. But if you check out their primary comms channels (email newsletter, ads, etc.), there's nothing about productivity there. In their app store it is a small, insignificant, and sometimes hard-to-find corner.
So Horizon Workrooms is for... ?
Or the quest pro for that matter, even though it flopped.
It's for measuring the temperature in the ecosystem. It is a very basic app produced by a skeleton team (by Meta's standards).
> Apple has had the advantage of green field engineering it exactly how they want it to be from the ground up.
It's not quite a green field on the software side, albeit mostly. Clearly they already have experience re-platforming a whole operating system multiple times. The underpinnings of macOS already power everything from desktops to smartphones to watches to tablets, all with diverging user interfaces. They had a solid first-party foundation on which to build the interface they wanted; Facebook is ultimately a third party to Android and is having to solve the same Android hardware-integration problems as everyone else.
> Think about what has to exist, to say, intelligently copy and paste parts of a 3D object made by one application into a 3D object made by another, the same way you would copy a flat image from photoshop into a Word document.
I own an AVP and this isn't something that can be done with it, to the best of my knowledge. Please explain how this is possible with the existing OS and apps.
I have seen people download 3d model file formats (stl) and position/scale them in front of them, and then walk around the 3d model. I am not sure if they added anything to the Vision Pro but it was pretty impressive. I would not be surprised if it can handle common 3d formats and render them straight to your AR environment out of the box.
Possible with the existing OS? Definitely. It’s just clipboard data
Possible with the current apps? None of them support a standard partial copy of 3D objects, but they do allow copy-pasting full objects between apps, afaik. E.g. I can drag a USDZ file from a message into Keynote.
None of this matters compared to the number of apps available. Given the high price of the Vision Pro and the resulting low sales, it would make little business sense for app developers to invest in creating apps for it instead of for the Quest 3.
This is a really interesting point and plenty of products have died on the hill of not realising that "content is king".
What makes it particularly interesting is that the VisionOS app store so far seems to have had quite an anemic reception from developers. Barely any novel non-toy apps have been released for it, with 8 months since devs got access last year and 2 months in the open dev ecosystem. It's possible the tsunami is just around the corner but it would have to be said that this seems to be diverging heavily at this point from the launch of the original iPhone app store. It was always going to be a question since the user base is miniscule compared to every other headset and iOS devs mostly have negligible experience in developing full scale VR / AR applications which is actually a very steep learning curve. So the barriers are high and the incentive relatively small.
If Apple fails to attract devs to its store it will create a huge problem for them that they are pretty unused to having. I wonder how they will approach a situation like that, since their culture is not used to dealing with that as a problem these days.
> None of this matters compared to the number of apps available.
There's a reason that iPadOS launched multi-window mode with arbitrary window sizes, rounded corners, and the same curved handle bar on each window, several iPadOS iterations back.
Doing that ensured that visionOS could launch with all iPad apps opted in: nearly any iPad app that respects iPadOS's "stage manager" multi-window mode works beautifully OOTB on visionOS.
In fact, for any iPad app run on visionOS, if you "pull" the app close to you at iPad size, you can touch it as if it were an iPad screen, and your fingers and touch work as if touching an iPad.
The only apps that don't work as if native are those doing something special with multi-touch or touch gestures, but most apps "just work". It's pretty wild.
The press keeps comparing Vision Pro to macOS. No, the 2D pass-through mode is a room-sized iPad stage manager: infinite iPads.
> Given the high price of the Vision Pro and the resulting low sales, it would make little business sense for app developers to invest in creating apps for it instead of for the Quest 3.
For sure. That’s exactly how it’s played out in iOS vs Android. No developer makes anything for the higher priced, small market iOS, right?
Perhaps it matters because of apps. With an OS that natively supports spatial features, it could be easier to expand the functionality of existing ios apps or interact with them in an ar/vr context.
This is a keen insight I had not until now appreciated. Thank you.
it is high time to stop praising apple for their software ecosystem. i am still impressed by their level of engineering, but it has not done much in terms of real use cases since the ios days.
case in point: the whole ipad and "what is a computer" campaign. it is hilarious when half-baked mouse support is celebrated in a tablet. despite using similar hardware, apple refuses to treat their tablets up to their true potential.
despite working on an RTOS for the avp, there is no sign that the headset stack will be exploitable by professionals, like the mac used to be. for the foreseeable future it will remain a good software demo built on top of nice displays.
I worked at Oculus for close to a year on a team that was charged with building a platform. The trouble at Oculus was that there were multiple warring platform efforts.
I'm kinda shocked that neither the Vision Pro nor the Quest allows for actual AR development, since the passthrough cameras aren't available to third-party devs.
Well, my desktop OS still treats my GPU as a second class citizen. In fact I'm not even sure if my OS has the concept of a GPU built in.
So OS is probably not as important as you think.
I think they mean OS in the broader sense than just the kernel.
> between Android underneath and Unity/Unreal at the application layer.
So they want to build a new kind of device and a new kind of experience, and they seriously think they can do that by just plugging together ready-made parts built by others? No wonder this is going nowhere.
I think you might be giving Apple too much credit for strapping the iPad OS on your face.
Granted, an iPad is better than an app launcher, but so far I don’t think the software is really “killer” in any specific way.
Most of the in depth reviews I’ve seen mostly praise the screen resolution and the movie experience.
> Apple has made a fully realized spatial operating system
Said that out loud to a group of techies and they laughed so hard one of them fell out of their seat.
Apple put the iPad on your face. And that's pretty much it.
The few VP users who haven't returned the device don't use any of the "spatial" features, like controlling the UI by pointing in space, since it's so inaccurate that it gives Swype a run for its money.