Together with next-generation ML accelerators in the CPU, the high-performance GPU, and higher-bandwidth unified memory, the Neural Engine makes M4 an outrageously powerful chip for AI.
In case it is not abundantly clear by now: Apple's AI strategy is to put inference (and longer term even learning) on edge devices. This is completely coherent with their privacy-first strategy (which would be at odds with sending data up to the cloud for processing).
Processing data at the edge also makes for the best possible user experience because of the complete independence of network connectivity and hence minimal latency.
If (and that's a big if) they keep their APIs open to run any kind of AI workload on their chips, it's a strategy that I personally really, really welcome, as I don't want the AI future to be centralised in the hands of a few powerful cloud providers.
> In case it is not abundantly clear by now: Apple's AI strategy is to put inference (and longer term even learning) on edge devices. This is completely coherent with their privacy-first strategy (which would be at odds with sending data up to the cloud for processing).
Their primary business goal is to sell hardware. Yes, they’ve diversified into services and being a shopping mall for all, but it is about selling luxury hardware.
The promise of privacy is one way in which they position themselves, but I would not bet the bank on that being true forever.
> but it is about selling luxury hardware.
Somewhat true, but things are changing. While there are plenty of “luxury” Apple devices like the Vision Pro or fully decked-out MacBooks for web browsing, we no longer live in a world where tech products are just lifestyle gadgets. People spend hours a day on their phones, and often run their lives and businesses through them. Even with the $1000+/2-3y price tag, it’s simply not that much given how central a role it serves in your life. This is especially true for younger generations who often don't have laptops or desktops at home, and also increasingly in poorer-but-not-poor countries (say, e.g., Eastern Europe). So the iPhone (their best-selling product) is far, far, far more a commodity utility than typical luxury consumption like watches, purses, sports cars, etc.
Even in the higher-end products like the MacBooks you see a lot of professionals (engineers included) who choose them for their price-performance value, and who don’t give a shit about luxury. Especially since the M1 launched, when performance and battery life took a giant leap.
Engineers use MacBook pros because it’s the best built laptop, the best screen, arguably the best OS and most importantly - they’re not the ones paying for them.
> Engineers use MacBook pros because it’s the best built laptop, the best screen, arguably the best OS and most importantly - they’re not the ones paying for them.
I am the one paying for my MacBook Pro, because my company is a self-funded business. I run my entire business on this machine and I love it. I always buy the fastest CPU possible, although I don't max out the RAM and SSD.
Amusingly enough, I talked to someone recently about compilation speeds and that person asked me why I don't compile my software (Clojure and ClojureScript) on "powerful cloud servers". Well, according to Geekbench, which always correlates very well with my compilation speeds, there are very few CPUs out there that can beat my M3 Max, and those aren't easily rentable as bare-metal cloud servers. Any virtual server will be slower.
So please, don't repeat the "MacBooks are for spoiled people who don't have to pay for them" trope. There are people for whom this is simply the best machine for the job at hand.
Incidentally, I checked my financials: a 16" MBP with M3 and 64GB RAM, amortized over 18 months (very short!) comes out to around $150/month. That is not expensive at all for your main development machine that you run your business on!
> comes out to around $150/month.
Which, incidentally, is probably about 10x less than you would spend compiling your software on "powerful cloud servers". :-)
For a fair comparison, what about comparing against the cheapest "power cloud server"?
I mean, Hetzner has a reputation for renting bare metal servers at the cheapest price in the market. Try the AX102, which has very close performance to an M3 Max (CPU only): https://www.hetzner.com/dedicated-rootserver/matrix-ax/
The OP's solution has a lot of advantages, like owning the device and getting a GPU included, but at least we do have cloud servers with comparable costs available.
Indeed! That server is very close to my M3 Max. I stand slightly corrected :)
Worth noting: the monthly cost is close to my 18-month amortized cost.
I tried a lot to use remote servers for development when I had an Intel MacBook and I found the experience to always be so frustrating that I upgraded to the M series. Have the tools gotten any better or is vscode remote containers still the standard?
I did use them several years ago, for Clojure and ClojureScript development. Docker and docker-compose were my main tools, with syncthing helping synchronize source code in real time, Emacs as the editor. Everything worked quite well, but was never as easy and smooth as just running everything locally.
vscode remote containers are still the standard, but I find them very usable nowadays. My setup is a MBP M2 that I use to remote into a Windows WSL setup at home, a Linux desktop at work, and various servers. Nesting remote SSH + remote Docker works seamlessly, that was previously a major headache.
In your case it makes sense to get the most performant machine you can get even if it means you're paying a ton more for marginal gains. This is not usually true for the general public.
The general public can buy an M1 MacBook Air for $799 if they need a laptop at all. An Air will serve them well for a long time.
Love my Air...
As a Clojure/ClojureScript developer myself, I just wonder what you do that compilation is such an important part of your workflow while at the same time not needing as much RAM as possible? Yes, the MacBook Pro isn't bad at all for Clojure(Script) development. I was pretty angry that the Lenovo ThinkPad T14 Gen3 has a full channel of soldered RAM and just a single slot for expansion, since I really use a lot of RAM and would prefer to go with a full dual-channel 64 GB rather than a hybrid 48 GB with 32 GB dual-channel and 16 GB single-channel. (Yes, it does actually work.) Most builds that I do are done asynchronously using GitHub Actions or similar. Yes, it does take some time, but build+deploy isn't that time sensitive.
In addition to the hardware, the OSX software is so much better, with flawless speed, productivity, and multitasking with gestures. Try doing desktop switching on Windows. On the flip side, I would gladly use the cloud if internet speeds and latency came down to a negligible level - we developers are an impatient lot.
"Engineers" - ironically the term used in the software industry for people who never standardize anything, solve the same problem solved by other "engineers" over and over again (how many libraries do you need for arrays and vectors and guis and buttons and text boxes and binary trees and sorting, yada yada?) while making the same mistakes and learning the hard way each time, also vehemently argue about software being "art" might like OSX, but even that is debatable. Meanwhile actual Engineers (the ones with the license) the people who need CAD and design tools for building bridges and running manufacturing plants stay far away from OSX.
I did EE in college but we mostly just used Windows because the shitty semi-proprietary SPICE simulator we had to use, and stuff like that, only supported Windows. The company that makes your embedded processor might only support Windows (and begrudgingly at that).
I think engineers using software should not be seen as an endorsement. They seem to have an incredible tolerance for bad UI.
Is it truly bad UI?
They may be locked in, which just forces things. Not an endorsement.
However, they may also be really productive with whatever it is. This could be an endorsement.
In CAD, as an example, there are often very productive interaction models that seem obtuse, or just bad to people learning the tools first time.
Improving first-time ramp-up to competence nearly always impacts the pro user too.
Where it plays out this way, I have always thought the UI was good in that the pros can work at peak efficiency. It is hard to beat them.
The fact is the task's complexity footprint is just large enough to make "good" (as in simple, intuitive) interfaces impossible.
You seem to be suggesting that a chunk of the hundreds of millions of people who use a UI that you don't like, secretly hate it or are forced to tolerate it. Not a position I'd personally want to argue or defend, so I'll leave it at that.
What an oddly aggressive and hostile response to such a banal observation. Yes, millions of people use software they hate, all the time, that’s wildly uncontroversial.
It's not an "observation", it's someone making it up. Why are you so upset if I disagree?
Making up what? Go drop by your nearby shops. My hair stylist constantly complains about the management software they use and the quality of the payment integration. At work I constantly hear complaints about shitty, slow IDEs. At the optician's, the guy has been complaining about the inventory system.
People hate software that they're forced to use. Professionals are better at tolerating crapware, because there's usually sunk cost fallacy involved.
There are only two types of software: those that people hate and those that nobody uses (a paraphrase)
<painfully earnest nerd>
Well actually, I use FreeBSD as my daily driver (on a used ThinkPad I bought for 300 euros), and I love it. :D
</painfully earnest nerd>
okay, now you're going to tell me that FreeBSD is in the "software nobody uses" category, aren't you?
This is not a reasonable way to infer the sentiment of hundreds of millions of people in different countries, different business, different situations, etc, etc.
Disguising it as an "observation" is even more ridiculous.
Indeed I’m not ready to defend it, it is just an anecdote. I expected the experience of using crappy professional software to be so universal that I wouldn’t have to.
Sure, and this is where I will ask you to post a list of "good" professional software so I can google all the bugs in that software :)
Nah, I'm good. Believe what you want to believe my friend.
>They seem to have an incredible tolerance for bad UI.
Irrelevant.
Firstly, it's a tool, not a social media platform designed to sell ads and farm clicks. It needs to be utilitarian and that's it, like a power drill or a pickup truck, not look pretty, since they're not targeting consumers but solving a niche set of engineering problems.
Secondly, the engineers are not the ones paying for that software, so their individual tolerance is irrelevant: their company pays for the tools, and tolerating those tools is part of the job description and the pay.
Unless you run your own business, you're not gonna turn down lucrative employment because on site they provide BOSCH tools and GM trucks while you personally prefer the UX of Makita and Toyota. If those tools' UX slows down the process and makes the project take longer, it's not my problem; my job is to clock in at 9 and clock out at 5, that's it. It's the company's problem to provide the best possible tools for the job, if they can.
Do you disagree with the sentence before the one you quoted? I think we basically agree; you came up with a bunch of reasons that support this:
> I think engineers using software should not be seen as an endorsement.
> my job is to clock in at 9 and clock out at 5
Where can I find one of those jobs?
I meant that figuratively. Obviously everyone has different working hours/patterns depending on the job market, skill set and personal situation.
But since you asked, Google is famous for low workloads. Or Microsoft. Or any other old and large slow moving company with lots of money, like IBM, Intel, SAP, ASML, Airbus, DHL, Siemens, manufacturing, aerospace, big pharma, transportation, etc. No bootstrapped "agile" start-ups and scale-ups, or failing companies that need to compete in a race to the bottom.
Depends mostly on where you live though.
France. You'll get a two hour lunch break too.
And you will usually leave at 18:30.
If you look at creative pros such as photographers and Hollywood ‘film’ editors, VFX artists, etc. you will see a lot of Windows and Linux, as people are more concerned about getting absolute power at a fair price and don’t care if it is big, ugly, etc.
Oh, I'm sure there are lots of creatives who use OSX, so I don't mean to suggest nobody uses it; I'll admit it was a bit in jest, to poke fun at the stereotype. I'm definitely oldschool, but to me it's a bit cringe to hear "Oh, I'm an engineer.." or "As an engineer.." from people who sit at a coffee shop writing emails or doing the most basic s/w dev work. I truly think silicon valley people would benefit from talking to technical people who are building bridges and manufacturing plants and cars and hardware and chips and all this stuff on r/engineeringporn that everyone takes for granted. I transitioned from s/w to hardcore manufacturing 15 years ago, and it was eye opening, and very humbling.
I’d assume a lot of this is because you can’t get the software on MacOS. Not a choice. Who is choosing to use Windows 10/11 where you get tabloid news in the OS by default? Or choosing to hide the button to create local user accounts?
Who is choosing to use macOS, where non-Apple monitors and other 3rd party hardware just stops working after minor updates and then starts working again after another update, without any official statement from Apple that there was a problem and a fix?
I do. Because for all issues it has, it is still much better than whatever Windows has to offer.
> where non-Apple monitors and other 3rd party hardware just stops working after minor updates and then starts working again after another update, without any official statement from Apple that there was a problem and a fix?
At least my WiFi doesn't turn off indefinitely during sleep until I power cycle the whole laptop because of a shitty driver.
So what, Windows does the same. Printers [1], WiFi [2], VPN [3], Bluetooth devices [4], audio [5] - and that's just stuff I found via auto-completing "windows update breaks" on Google in under 5 minutes.
The only problem is that Apple is even worse at communicating issues than Microsoft is.
[1] https://www.bleepingcomputer.com/news/microsoft/microsoft-wa...
[2] https://www.bleepingcomputer.com/news/microsoft/microsoft-fi...
[3] https://www.bleepingcomputer.com/news/microsoft/microsoft-sa...
[4] https://www.forbes.com/sites/gordonkelly/2019/06/12/microsof...
[5] https://www.theregister.com/2022/08/22/windows_10_update_kil...
The big difference is that Microsoft - at least usually - confirms and owns the issues.
With Apple, it's usually just crickets... nothing in the release notes, no official statements, nothing. It's just trial and error for the users to see if a particular update fixed the issue.
That's anti-competitive and frustrating, but not an argument against the value of a pure Apple hardware ecosystem.
Which was not the point. The question was who would be choosing Windows over macOS. I would and this is one of the reasons why.
People overwhelmingly choose Windows world-wide to get shit done. That answers the who.
So the same software exists on multiple platforms, there are no legacy or hardware compatibility considerations, interoperability considerations, no budget considerations, and the users have a choice in what they use?
I.e the same functionality exists with no draw backs and money was no object.
And they chose Windows? Seriously why?
More choice in hardware. More flexibility in hardware. UI preferences. You can't get a Mac 2 in 1 or a Mac foldable or a Mac gaming notebook or a Mac that weighs less than a kilogram. You can't get a Mac with an OLED screen or a numpad. Some people just prefer the Windows UI too. I usually use Linux but between MacOS and Windows, I prefer the latter.
We use the sales metrics and signals available to us.
I don't know what to say except resign to the fact that the world is fundamentally unfair, and you won't ever get to run the A/B experiment that you want. So yes, Windows it is !
You seem to have some romanticized notion of engineers and to be deeply offended by someone calling themselves an engineer. Why do you even care if someone sits at a coffee shop writing emails and calls themselves an engineer? You think it somehow dilutes the prestige of the word "engineer"? Makes it less elite, or what?
"deeply offended" - My default response to imposters is laughter. Call yourself Lord, King, President, Doctor, Lawyer whatever - doesn't matter to me. I'd suggest you to lighten up.
They hate you because you speak the truth. Code monkeys calling themselves engineers really is funny.
"silicon valley people would benefit from talking to people who build chips", that's a good one!
It would be funny, if it wasn't also sad to see the decline.
Do you have an engineering degree ?
Yes, a bachelors and a masters.
Not that the degree means much; I learnt 90% of what I know on the job. It certainly helped get my foot in the door through the university brand and alumni network.
You can call yourself anything you want Doctor, Lawyer, Engineer. I have the freedom to think my own thoughts too.
I always likened "engineers"[1] to "people who are proficient in calculus"; and "computers"[1] to "people who are proficient at calculations".
There was a brief sidestep from the late 1980s to the early 2010s (~2012) where the term "software engineer" came into vogue and ran completely orthogonal to "proficiency in calculus". I mean, literally 99% of software engineers never learned calculus!
But it's nice to see that ever since ~2015 or so (and perhaps even going forward) proficiency in calculus is rising to the fore. We call those "software engineers" "ML Engineers" nowadays, ehh fine by me. And all those "computers" are not people anymore -- looks like carefully arranged sand (silicon) in metal took over.
I wonder if it's just a matter of time before the carefully-arranged-sand-in-metal form factor will take over the "engineer" role too. One of those Tesla/Figure robots becomes "proficient at calculus" and "proficient at calculations" better than "people".
Reference: [1]: I took the terms "engineer" and "computer" literally out of the movie "Hidden Figures" https://en.wikipedia.org/wiki/Hidden_Figures#Plot
It looks like ever since humankind learned calculus there was an enormous benefit to applying it in the engineering of rockets, aeroplanes, bridges, houses, and eventually "the careful arrangement of sand (silicon)". Literally every one of those jobs required learning calculus at school and applying calculus at work.
Why point out Calculus as opposed to just Math?
Might be just my Eastern Europe background where it was all just "Math" and both equations (that's Algebra I guess) and simpler functions/analysis (Calculus?) are taught in elementary school around age 14 or 15.
Maybe I'm missing/forgetting something - I think I used Calculus more during electrical engineering than for computer/software engineering.
In my Central European university we learned "Real Analysis", which was far more concerned with theorems and proofs than with "calculating" something - if anything, actually calculating derivatives or integrals was a warmup problem before the meat of the subject.
Calculus, because all of engineering depends critically on the modeling of real world phenomena using ordinary or partial differential equations.
I don’t mean to disregard other branches of math — of course they’re useful — but calculus stands out in specific _applicability_ to engineering.
Literally every single branch of engineering. All of them, from petrochemical engineering to biotech. They all use calculus as a fundamental block of study.
Discovering new drugs using PK/PD modeling is driven by modeling the drug<->pathogen relationship as cycles using Lotka models.
I'm not saying engineers don't need to learn stats or arithmetic. IMO those are more fundamental to _all_ fields - janitors or physicians or any field really. But calculus is fundamental to engineering alone.
Perhaps a begrudging exception I can make is its applications in finance.
But in every other field where people build rockets, cars, airplanes, drugs, or AI robots, you'd need proficiency in calculus just as much as you'd need proficiency in writing or proficiency in arithmetic.
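For a concrete picture (just the textbook Lotka-Volterra form, not any specific PK/PD model), the Lotka models mentioned above are a small coupled ODE system:

    \frac{dx}{dt} = \alpha x - \beta x y, \qquad \frac{dy}{dt} = \delta x y - \gamma y

where x and y are the two interacting populations and \alpha, \beta, \gamma, \delta are rate constants; the periodic solutions of this system are the "cycles" referred to above, and you can't get anywhere with it without derivatives.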
True, we learnt calculus before college in my home country - but it was just basic stuff. But I learnt a lot more of it including partial derivatives in first year of engineering college.
>I think I used Calculus more during electrical engineering than for computer/software engineering.
I think that was OPs point - most engineering disciplines teach it.
Yeah computer science went through this weird offshoot for 30-40 years where calculus was simply taught because of tradition.
It was not really necessary through all of the app-developer eras. In fact, it's so much the case that many software engineers graduating from 2000-2015 or so work as software engineers without a BS degree. Rather, they could drop the physics & calculus grind and opt for a BA in computer science. They then went on to become proficient software engineers in the industry.
It’s only after the recent advances of AI around 2012/2015 did a proficiency in calculus become crucial to software engineering again.
I mean, there’s a whole rabbit hole of knowledge on the reason why ML frameworks deal with calculating vector-Jacobian or Jacobian-vector products. Appreciating that, and their relation to gradients, is necessary to design & debug frameworks like PyTorch or MLX.
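As a minimal sketch of what that looks like in practice (assuming PyTorch; torch.autograd.functional.vjp is just one way to get at it), reverse-mode autodiff never builds the full Jacobian, it only ever computes v^T J:

    # Hedged sketch: a vector-Jacobian product via PyTorch's autograd helpers.
    import torch

    def f(x):
        # simple vector-valued function f: R^3 -> R^2
        return torch.stack([x[0] * x[1], x[1] * x[2] ** 2])

    x = torch.tensor([1.0, 2.0, 3.0])
    v = torch.tensor([1.0, 1.0])  # the "vector" in vector-Jacobian product

    out, vjp = torch.autograd.functional.vjp(f, x, v)
    print(out)  # f(x)
    print(vjp)  # v^T J, where J is the 2x3 Jacobian of f at x

The gradient of a scalar loss is just the special case where v is 1, which is why the chain rule and these products sit at the core of backprop.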
Sure, I will concede that sans-calculus training (a BA in Computer Science) can still be sufficient for working as an ML engineer in data analytics, api/services/framework design, infrastructure, systems engineering, and perhaps even inference engineering. But I bet all those people will need to be proficient in calculus the more they have to deal with debugging models.
That 99% guess seems high considering calculus is generally a required subject when studying computer science (or software engineering) at most universities I know of.
In mine it was mandatory; there were 9 + 9 + 4.5 credits of calculus alone. And there was way more: discrete math, algebra...
You’re right it’s a total guess. It’s based on my experience in the field.
My strong “opinion” here comes from an observation that while calculus may have been a required subject of study in awarding engineering degrees, the reality is, people didn’t really study it. They just brushed through a couple of pages and wrote a few tests/exams.
In America there’s a plethora of expert software engineers who opt for a bachelor's degree in computer science that is a BA, not a BS.
I think that’s a completely reasonable thing to do if you don’t want to grind out the physics and calculus courses. They are super hard, after all. And let’s face it, all of the _useful to humanity_ work in software hasn't required expertise in physics or calculus, at least until now.
With AI going forward it’s hard to say. If more of the jobs shift over to model building then yes perhaps a back to basics approach of calculus proficiency could be required.
Most software engineering just doesn’t require calculus, though it does benefit from having the understanding of functions and limit behaviors that higher math does. But if you look at a lot of meme dev jobs they’ve transitioned heavily away from the crypto craze of the past 5 years towards “prompt engineering” or the like to exploit LLMs in the same way that the “Uber for X” meme of 2012-2017 exploited surface level knowledge of JS or API integration work. Fundamentally, the tech ecosystem desires low skill employees, LLMs are a new frontier in doing a lot with a little in terms of deep technical knowledge.
Hmm, that is an interesting take. Calculus does seems like the uniting factor.
I've come to appreciate the fact that domain knowledge plays a more dominant role in solving a problem than technical/programming knowledge. I often wonder how s/w could align with other engineering practices and approach design in a standardized way, so we could just churn out code without an excessive reliance on quality assurance. I'm really hoping visual programming is going to be the savior here. It might allow SMEs and domain experts to use a visual interface to implement their ideas.
It's interesting how Python dominated C/C++ in the case of the NumPy community. One would have assumed C/C++ to be a more natural fit for performance-oriented code. But the domain knowledge overpowered the technical knowledge and eventually people started asking funny questions like
https://stackoverflow.com/questions/41365723/why-is-my-pytho...
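To make that concrete, here is a minimal sketch (timings will vary by machine; assumes NumPy is installed) of why the "slow" language can win: the vectorized call hands the whole loop to optimized C/BLAS code, while the hand-written loop stays in the interpreter:

    # Hedged sketch: vectorized NumPy vs. an interpreted Python loop.
    import time
    import numpy as np

    n = 1_000_000
    a = np.random.rand(n)
    b = np.random.rand(n)

    t0 = time.perf_counter()
    dot_loop = sum(x * y for x, y in zip(a, b))  # element by element in Python
    t1 = time.perf_counter()
    dot_vec = a @ b                              # single call into BLAS
    t2 = time.perf_counter()

    print(f"loop: {t1 - t0:.3f}s  vectorized: {t2 - t1:.5f}s")

The same kind of effect is behind the question linked above: the domain expert writing idiomatic NumPy gets C-level performance without writing C.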
I agree a hundred percent that domain knowledge is the single most dominant influence on problem-solving expertise.
there was some old commercial that had the tagline "performance is nothing without control". If you can't put the technology to work on your problems then the technology, no matter how incredible, is worthless to you.
This checks out. I'm a software developer who took math all through high school and my first three years of college. I barely scraped through my calculus exams, but I excelled at combinatorics, probability, matrix math, etc. (as long as it didn't veer into calculus for some reason).
I guess I just enjoy things more when I can count them.
For this kind of engineering, I think calculus is not the main proficiency enhancer you claim it to be. Linear algebra, combinatorics, probability and number theory are more relevant.
Calculus was important during the world wars because it meant we could throw shells at the enemy army better, and that was an important issue during that period.
Nowadays, calculus is just a stepping stone to more relevant mathematics.
Calculus has never gone out of style ;)
Today's ML frameworks grapple with the problem of "Jacobian-vector products" and "vector-Jacobian products" as a consequence of the interplay between gradients and derivatives, and the application of the "chain rule". All three of those concepts are fundamentally understood by being proficient in calculus.
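Spelled out (nothing framework-specific, just the textbook definitions): for f: R^n -> R^m with Jacobian J_f(x) in R^{m x n},

    \text{JVP (forward mode):}\quad u \mapsto J_f(x)\,u
    \text{VJP (reverse mode):}\quad v \mapsto v^{\top} J_f(x)

and for a composition h = g \circ f, the chain rule J_h(x) = J_g(f(x))\,J_f(x) is what lets either product be accumulated layer by layer without ever materializing the full Jacobians.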
While I’m being the hype-man for calculus, I don’t mean to say proficiency in linear algebra or statistics is "less necessary" or "less useful" or "less challenging" or "less..." in any way.
I’m merely stating that, historically, calculus has been the unique branch of study for engineering. Statistics has always found value in many fields — business, finance, government policy etc.
Sure Linear algebra is one of those unique fields too — I kinda like to think of it as “algebra” in general and perhaps its utility has flowed in tandem with calculus. Idk. I haven’t thought super hard about it.
Calculus is continuous, analog math. Digital Computers use discrete math.
Both are math, and both are still incredibly important. Rockets haven't gone out of style.
Aren’t you supposed to learn calculus to be able to understand what O(n) even is? Is it not a standard part of a CS major?
You have precisely captured why I got interested in AI.
They also drive trains
From what I've heard (not an OSX user) Windows is the best operating system for multiple screens; OSX and Linux glitch way more. Most anyone doing 3D sculpture or graphics/art on a professional level will eventually move to working with 2-3 screens, and since there are no exclusively Mac design programs, OSX will be suboptimal.
There's little things too, like some people using gaming peripherals (multi-button MMO mice and left hand controllers, etc.) for editing, which might not be compatible with OSX.
And also, if you're mucking around with two 32 inch 4k monitors and a 16 inch Wacom it might start to feel a little ridiculous trying to save space with a Mac Pro.
Aside from Windows having more drivers for USB adapters than Linux* (which is a reflection of the market), I find Linux has far fewer glitches using multiple screens.
Once it works, Linux is more reliable than Windows. And virtual desktops have always worked better on Linux than on Windows. So I disagree with you on that front.
* In my case, this means I had to get an Anker HDMI adapter, instead of any random brand.
>I find Linux having much fewer glitches using multiple screens.
Maybe as long as you don't need working fractional scaling with different DPI monitors, which is nothing fancy now.
Nitpick: it hasn’t been called “OS X” for almost eight years now, starting with macOS Sierra.
I’ve been doing art on a pro level for twenty five years and I dislike multiple monitors.
I am just commenting about what I've seen at concept artist desks / animation studios / etc.
Why is that?
I am not an artist and also dislike multiple monitors, though I will employ two of them on occasion.
My reasons are:
If the window and application management doesn't suck, one display is all one needs.
With cheap multiple displays and touch devices came an ongoing enshitification of app and window management. (And usually dumber focus rules)
Having to turn my head x times a day sucks.
Who do you think writes those CAD and design tools that help “actual engineers” solve the same problems over and over?
Would you like me to explain how it works to you? I'm not sure why you added a question mark.
Yes, they were asking you a question. Do you not understand question marks?
I'd say a lot of engineers (bridges, circuit boards, injection mouldings) are kept far away from OSX (and Linux). Honestly, I'd just love an operating system that doesn't decide it's going to restart itself periodically!
> Honestly, I'd just love an operating system that doesn't decide it's going to restart itself periodically!
My MBP has been running without any restart for over a month.
Yes. I'm pretty sure my wife's 2014 MacBook Air has gone 6 months without a restart. My Windows 11 workstation, however, has never done a week. I power down daily now to avoid disappointment.
> who never standardize anything
IETF RFCs soon number over 10K; Java, win32, the Linux kernel syscall API are famous for backward compatibility
not to mention the absurd success of standard libraries of Python, Rust, PHP and certain "standard" projects like Django, React, and ExpressJS
> (how many libraries do you need for arrays and vectors and guis and buttons and text boxes and binary trees and sorting, yada yada?)
considering the design space is enormous and the tradeoffs are not trivial ... it's good to have libraries that fundamentally solve the same thing but in different, context-dependent ways
arguably we are using too many libraries and not enough problem-specific in-situ DSLs (see the result of Alan Kay's research the STEPS project at VPRI - https://news.ycombinator.com/item?id=32966987 )
I'd argue almost all NEW library development is about politics and platform ownership. Every large company wants to be the dependency that other projects tie into. And if you don't want to hitch your wagon to google or facebook or whoever, you roll your own.
Many if not most computational problems are fundamentally about data and data transformation under constraints - throughput, memory, latency, etc. And for the situations where the tradeoffs are non-trivial, solving the problem is purely about domain knowledge regarding the nature of the data (video codec data, real-time sensor data, financial data, etc.), not about programming expertise.
The various ways to architect the overall design at a high level - client/server, P2P, distributed vs local, threading model - are, IME, not what I would call crazy complicated. There are standard ways of implementing the variations of the overall design, which sadly, because of an overall roll-your-own mindset, most devs are reluctant to adopt from someone else's design. Part of that is that we don't have a framework of knowledge that lets us build a library of these designs in our heads, from which we can just pick the one that's right for our use case.
I don't agree with your characterization of the design space as 'enormous'. I'd say most programmers just need to know a handful of design types, because they're not working on high-performance, low-latency, multi-million-endpoint scalable projects where, as you say, things can get non-trivial.
I'll give a shot at an analogy (I'm hoping the nitpickers are out to lunch). The design space for door knobs is enormous because of the various hand shapes, disability constraints, door sizes, applications, security implications, etc. And yet we've standardized on a few door knob types for most homes, which you can go out and buy and install yourself. The special cases like bank vaults and prisons and other domains solve it their own way.
I challenge you to take those people who build bridges and have them write full software.
I am not saying whether software is engineering or not.
It is a fact, in terms of cost, that software and bridge building are, most of the time, very different activities with very different goals and cost-benefit ratios.
All those things count when making decisions about the level of standardization.
About standards... there are also lots of them, and widely used, from networking protocols to data transfer formats... with well-known strengths and limitations.
In my 30+ year career I can confidently say that Software Engineers look towards standardisation by default, as it makes their lives easier.
It feels to me that you're bitter or had more than one bad experience. Perhaps you keep working with, or come across, bad Engineers as your generalising is inaccurate.
Maybe we need a new moniker "webgineer". The average HN/FAANG web programmer does appear to vastly overestimate the value of their contributions to the world.
Have we done full circle?
When I started doing this "Internet stuff" we were called "webmasters", and the job would actually include what today we call:
- DevOps
- Server/Linux sysadmin
- DB admin
- Full stack (backend and frontend) engineer
And I might have forgotten some things.
1999 indeed! I haven't heard that term since around 1999 when I was hired as a "web engineer" and derisively referred to myself as a "webgineer". I almost asked if I could change my title to "sciencematician".
People who cobble together new printers or kettles overestimate the value of their contributions to the world too. The delineation isn't between JS devs and JPL or ASML engineers.
You can shit all you want on so-called "engineers", but they are the ones who make the CAD that the "real engineers" you're talking about use. So get off your high horse.
You're kidding yourself if you don't think that mechanical, structural or any other engineers don't do the same thing. They do.
I worked for one of the UK's leading architecture/construction firms writing software, and I'm also an amateur mechanic.
You'd be amazed at how many gasket types, nuts, bolts, fasteners, unfasteners, glues, concretes, bonding agents and so on ... all invented for edge preferences, and most of which could be used interchangeably.
Also standards? Hah. They're an absolute shitshow in any engineering effort.
I mean ... even just units of measure. C'mon.
And they can typically setup their dev environment without a VM, while also getting commercial app support if they need it.
Windows requires a VM, like WSL, for a lot of people, and Linux lacks commercial support. macOS strikes a good balance in the middle that makes it a pretty compelling choice.
There are a plethora of companies offering commercial support for various Linux distributions.
I was thinking more about software like the Adobe suite, Microsoft Office, or other closed-source software that hasn't been released on Linux. Electron has made things a bit better, but there are still a lot of big gaps for the enterprise, unless the company is specifically choosing software to maintain Linux support for end users.
Sure, Wine exists, but it’s not something I’d want to rely on for a business when there are alternatives like macOS which will offer native support.
Most people don't need the Adobe suite, and the web version of M$ Office is more than OK for occasional use. Most other enterprise software is web apps too nowadays, so it's much less relevant what OS your machine is running than it was ten years ago...
Excel is essential, and in most businesses that I worked with, most of the accounting and business side is run on it. I switched to Windows from Linux just because of Excel when WSL came out. If Linux had Excel and Photoshop it would be a no-brainer to choose it, but that will never happen.
Yep, that's pretty much it.
Apple fanboys like to talk about how cool and long-lasting a MacBook Air is, but a 500 bucks Chromebook will do just as well while allowing pretty much 90% of the use cases. Sure, the top-end power is much lower, but at the same time, considering the base RAM/storage combo Apple gives, it is not that relevant. If you start loading it up, that puts the pricing in an entirely different category, and in my opinion the MacBook Air becomes seriously irrelevant when compared to serious computing devices in the same price range...
There's still a huge market for people who want higher end hardware and to run workloads locally, or put a higher price on privacy. For people who want to keep their data close to their chest, and particularly now with the AI bloom, being able to perform all tasks on device is more valuable than ever.
A Chromebook "does the job" but it's closer to a thin client than a workstation. A lot of the job is done remotely and you may not want that.
Yes, but for those people, if you consider the price of a fully loaded MacBook Pro, it is a rather small win given all the limitations.
If the only things you care about are battery life (only if you plan to use it lightly on the go, because even high-end Apple Silicon sucks a decent amount of power at full tilt) and privacy, I guess they are decent enough.
This is my argument: the base models are at the same time overkill and too limited considering the price, and the high-end models are way too expensive for what they bring to the table.
Apple has a big relevancy problem because of how they put a stupid pricing ladder on everything, but that is just my opinion, I guess. As long as they keep making a shit ton of cash it doesn't matter, I suppose. But if the relevant people stop buying Macs because they make no sense, it will become apparent why it matters sooner or later...
Not at all: a Chromebook lets you run Linux apps. I can run full-blown IDEs locally without problems. And yes, that is with 8 GB of RAM; ChromeOS has superb memory management.
Since the full blown IDE is running in a Linux VM, don't you mean, "Linux has superb memory management"?
Well, Google developed and deployed MGLRU to Chromebooks long before upstreaming it. Plus they use some magic to check the MGLRU working set size inside the VMs and balance everything.
Now I see. Interesting. (I'm planning to switch to ChromeOS, BTW.)
what chromebooks come with a mini LED HDR screen and insane battery life? i’d love to know
Are you seriously arguing about mini-LED displays, which are only found in the expensive MacBook Pro, when I mentioned a cheap 500 dollar Chromebook? There is at least a 4x difference in price between those machines; it is ridiculous to even pretend they are somewhat comparable.
And if we are talking about expensive high-end hardware, mini-LED is worse than the OLED found in those machines anyway, so it's not like that would be a strong argument.
yes, because i find the argument that a chromebook comes close to a MacBook to be extremely disingenuous.
no, i’m a big fan of OLED, but for a laptop, especially a non-gaming one, the extra brightness and lack of burn-in concern make mini-LED better.
My argument isn't about Chromebooks vs any MacBook. My argument is against a base MacBook Air that is too expensive for relatively limited added utility against something like a cheaper Chromebooks.
Sure, the MacBook Air is better built and will do some things better but those things are not extremely relevant for someone who would be satisfied by an entry level MacBook Air, because while an MBA has some very nice attributes, in the end everything is limited by its RAM/storage (and to a lesser degree, ports).
For a concrete example, in my country the cheaper MacBook Air you can get is the old M1 design at 1k€, then there is the M2 refresh at 1.2k€ and M3 variant at 1.3k€.
But you can get an Asus Chromebook Plus for 600€ that has either the same amount of RAM and storage, or more RAM (16 GB), or more storage (512 GB), depending on the variant you end up with. The display is worse (100 nits less bright and a worse resolution) but slightly bigger (14"), and that may matter more to many people. It has an older Intel i5 (you can find some AMD options for better efficiency), but that hardly matters for the vast majority of people who just want a laptop to do the basics (basically the target of a MacBook Air). Its battery life would be a bit worse than an MBA's, but not in a way that is relevant for the vast majority of customers. One major advantage it has over an MBA is a better port selection, with an HDMI port, a USB-A port and an SD card reader on top of the 2 Thunderbolt/USB-C ports the MBA has, allowing a dongle-free life without having to buy anything else and providing much better utility. That can be way more relevant for many people than better build quality (which I would argue does not even bring better longevity, since with Apple you are hostage to the software support anyway).
You see I am not against MacBooks; in fact, I would advise purchasing a MacBook Pro for some specific use case.
But the reality is that the entry level Apple hardware is rather compromised for its price, and if someone would be satisfied by that choice, I'm arguing that there is another choice (worse on paper, better in some ways) but at half the price (40% off minimum).
If you start loading up a MacBook Air, you end up in MacBook Pro price territory and it doesn't make a lot of sense to not add the 100-200 more to get the much better machine.
I know from experience that entry-level Apple hardware is a terrible deal, both because I made the mistake myself and because I had to help with/fix the issues for other people who made those choices. I have a cousin who reminds me every time how much he fucking hates his entry-level iMac (an old one with a compromised Fusion Drive and minimum RAM) even though it was rather expensive compared to most computers. My answer is always the same: you spent a lot, but not enough, because with Apple you do not deserve a good experience unless you go above and beyond in your spending.
In my opinion it is way more disingenuous to advocate for entry-level Apple hardware to people who would be very much satisfied with products costing half as much. The value is just not there, Apple went way too far in the luxury territory in locking down everything and creating a pricing ladder that is engineered to confuse and upsell customers to extract as much money as possible.
For someone who really needs a computer to do more intense work, provided they can work around the tradeoffs of Apple Silicon/macOS and they are willing to spend a large amount of cash, Apple has some great hardware for sure. For everyone else the value is extremely questionable, especially since they are going full steam ahead into service subscriptions and the software can be lacking in ways that will require purchasing even more stuff; the total cost of ownership doesn't make sense anymore. For example, their iPhone SE is absolutely terrible: at 500€ you pay for 6-year-old technology with a small screen relative to its footprint, terrible battery life, etc. A 500€ mid-range Android is so much better in so many ways that it is just stupid at this point...
As for OLED, I don't think burn-in is a significant concern anymore, and if it is, I would argue that you are using the laptop too much like a desktop. In my country you could buy 2 decent OLED laptops for the price of an entry-level MacBook Pro anyway, so it doesn't matter as much (and replacing displays from hardware manufacturers other than Apple is much easier and cheaper, so there is that). I think the MacBook Pros are very good for some niche applications, but at a minimum viable price of 2.23k€ (16 GB RAM/512 GB storage) there are a lot of good options, so it really requires a careful analysis of the actual use case. If you do things related to 3D or engineering it is probably not worth it...
No mini LED, but you can configure the HP Elite Dragonfly Chromebook with a 1000 nits IPS display.
And AFAIK, Google dictates 10+h of battery life with mixed web browsing for all Chromebooks.
1000 nits is useless without HDR.
I'm pretty sure it does HDR too.
without mini LED / precise backlight control though, it’s useless
and backlight control
You usually don't need either for software development though, and if you do, the free or online alternatives are often good enough for the rare occasions you need them. If you are a software developer and you have to spend significant time using Office, it means either you are developing extensions for Office, or your company management is somewhat lacking and you are forced to handle things you should not (like bureaucracy, for instance).
Where I’m at my email is in Outlook. Having to use the web version sounds annoying. I also end up getting a lot of information in spreadsheets. Having to move all that to the online version to open also sounds annoying. The online version is also more limited, which could lead to issues.
I could see a front end dev needing Photoshop for some things, if they don’t have a design team to give them assets.
There is also security software the company says laptops must have which isn't available for Linux. They only buy and deploy this stuff with Windows and macOS in mind.
A couple weeks ago on HN I saw someone looking for a program to make a demo of their app (I think). The comments were filled with people recommending an app on macOS that was apparently far and away the best option, and many were disappointed by the lack of availability elsewhere. I find there are a lot of situations like this, where I might be able to get the job done on another OS, but the software I actually want to use is on macOS. Obviously this one is a matter of taste to some degree.
It’s not as big an issue as it was 20 years ago, but it’s still an issue in many environments.
I use Linux for work with the MS apps used in a browser. I use one specific app using a remote desktop .. also using a browser.
So this can be done. I don't expect the IT support to help me with any Linux issues.
My excuse for using Linux? It makes me more effective at developing software.
If you mean WSL for containers, macOS needs a VM too. If you're doing C++, macOS dev tools are... bleak. Great for webdev though.
↑ This!
I would love to buy Apple hardware, but not from Apple. I mean: an M2 13-inch notebook with the ability to swap/extend memory and storage, a regular US keyboard layout and a proper desktop Linux (Debian, Alpine, Mint, PopOS!, Fedora Cinnamon) or Windows. MacOS and the Apple ecosystem just get in your way when you're just trying to maintain a multi-platform C++/Java/Rust code base.
WSL for normal stuff. My co-worker is on Windows and had to setup WSL to get a linter working with VS Code. It took him a week to get it working the first time, and it breaks periodically, so he needs to do it all over again every few months.
I'm developing on Windows for Windows, Linux, Android, and web, including C, Go, Java, TSQL and MSSQL management. I do not necessarily need WSL except for C. SSH is built directly into the Windows terminal and is fully scriptable in PS.
WSL is also nice for Bash scripting, but it's not necessary.
It is a check box in the "Add Features" panel. There is nothing to install or setup. Certainly not for linting, unless, again, you're using a Linux tool chain.
But if you are, just check the box. No setup beyond VS Code, bashrc, vimrc, and your tool chain. Same as you would do on Mac.
If anything, all the Mac specific quirks make setting up the Linux tool chains much harder. At least on WSL the entire directory structure matches Linux out of the box. The tool chains just work.
While some of the documentation is in its infancy, the workflow and versatility of cross platform development on Windows, I think, is unmatched.
This. I have to onboard a lot of students to our analysis toolchain (Nuclear Physics, ROOT based, C++). 10 years ago I prayed that the student had a Mac, because it was so easy. Now I pray they have Windows, because of WSL. The tool chain is all compiled from source. Pretty much every major version, and often also minor versions, of macOS break the compilation of ROOT. I had several release upgrades of Ubuntu that only required a recompile, if that, and it always worked.
Unless he is doing Linux development in the first place, that sounds very weird. You most certainly don't need to set up WSL to lint Python or say JS in VSCode on Windows.
That sounds wild, you can run bash and unix utils on windows with minimal fuss without WSL. Unless that linter truly needed linux (and i mean, vscode extensions are typescript..) that sounds like overkill
Don't you need Cygwin or Git Bash if you don't use WSL? That's kind of fussy.
As Windows/UNIX developer, I only use WSL for Linux containers.
What do you mean without a VM? I guess you don't count docker/podman as VMs then?
Likely that most devs want to use Unix tools — terminal, etc.
Those aren't VMs -- they're containers.
Only on Linux - on MacOS and Windows, you have to do virtualization for containers.
Unless you do use WSL1 as your container runner. Nobody does this but I do not get why.
I can hardly imagine how that works for you, because WSL1 basically lacks any support for containers: namespaces, cgroups, netfilter, bridges?
Feel free to correct me and share successful cases of WSL1 with containers.
WSL is not a VM. Edit: TIL WSL2 is a VM. I develop on Mac and Linux computers, so I should have kept my mouth shut anyways.
Hey, you learned and corrected yourself, don't be so hard on yourself mate.
Seriously! I agree. They just modeled the best discussion with some of the highest value there is.
Being wrong is no big deal. Being unable to be right is often a very big deal.
Username highly inaccurate ;)
Downvoted for complimenting GP?
Stay classy, Hacker News
Just to make sure your TIL is complete, do note that Linux containers are VMs also on MacOS :)
> Engineers use MacBook pros because it’s the best built laptop, the best screen, arguably the best OS and most importantly - they’re not the ones paying for them.
I know engineers from a FANG who picked MacBook Pros in spite of the specs and only because of the bling/price tag. Then they spent their whole time using them as remote terminals for Linux servers, and they still complained about the things being extremely short on RAM and HD.
One of them even tried to convince their managers to give the Vision Pro a try, even though there were zero use cases for it.
Granted, they drive multiple monitors well with a single USB-C plug, at least with specific combinations of monitors and hubs.
It's high time that the "Apple sells high end gear" shtick is put to rest. Even their macOS treadmill is becoming tiring.
The build quality of Apple laptops is still pretty unmatched in every price category.
Yes, there are $2k+ laptops from Dell/Lenovo that match and exceed a similarly priced MacBook in pure power, but they usually lack battery life and/or build quality.
Apple devices also work quite seamlessly together. iPads, for example, work great as a wireless second screen with the MBPs. I'd immediately buy a 14-inch iPad just for that, since it is so useful when not at your standard desk. Also, copy-paste between devices or headphones just work...
If Apple came up with the idea of using an iPad as an external compute unit, that would be amazing... just double your RAM, compute and screen with it, in such a lightweight form factor... should be possible if they want to.
You can use the iPad as a second monitor on Windows too and it works nicely. I also use my AirPods Pro with my Dell XPS running Windows and it's perfect.
is there now a low-latency solution for a second monitor on Windows? I was only aware of some software where latency is quite bad, or one company that provided a wireless HDMI/DisplayPort dongle...
Also, the nice thing about headphones within the Apple ecosystem is that the AirPods automatically switch to where the attention is... e.g., if I'm watching something on the laptop and pick up an iPhone call (no matter whether via phone or any app), the AirPods automatically switch.
My 15-inch MacBook, which fried its display twice (it didn't go to sleep properly, was put in a backpack, and overheated - and there is no way to see that sleep didn't kick in), and then had the broken display cable problem (widespread, and Apple wanted $900 for a new display..), would disagree. For comparison: the 4k touch display on my XPS 15 that didn't survive a Diet Coke bath was <$300, including labor for a guy to show up in my office and repair it while I was watching....
> The build quality of Apple laptops is still pretty unmatched in every price category.
I owned a MacBook Pro with the dreaded butterfly keyboard. It was shit.
How many USB ports does the new MacBook Air have? The old ones had two. And they shipped with 8GB of RAM? These are shit-tier specs.
The 2020 MacBook Pros had a nice thing: USB-C charging, and you could charge them from either side. Current models went back to MagSafe, only on one side. The number of USB ports is still very low.
But they are shiny. I guess that counts as quality.
I guess we can agree to disagree, but I find the 2020 rev Macbook pros have a good number of USB-C ports (2 on the left, 1 on the right -- all can do PD), a magsafe charger, headphone jack, HDMI port and SD card slot. How many USB-C ports do you need? Sometimes I wish there was ethernet but I get why it's not there.
I agree, the butterfly keyboard was shitty but I absolutely love the keyboard on the 2020 rev. It's still not as great as my mechanical desktop keyboard, but for a laptop keyboard it's serious chef's kiss. Also, I have yet to find a trackpad that is anywhere as good as the Macbook. Precision trackpads are still way way worse.
Finally, the thing that always brings me back to MBPs (vs Surface Books or Razers) is battery life. I typically get a good 10+ hours on my MBP. Battery life on my old Razer Blade and Surface Books was absolutely comically horrible.
I'm absolutely not an Apple person. Privately own zero Apple hardware.
However there are two awesome things about my work MBP I would really want from my ThinkPad:
Magsafe charger - too many close calls!
And the track pad.
I can't work properly without an external mouse on my ThinkPad. But on the MBP everything just has the right size, location, proportions and handling on the track pad. I had a mouse for the MBP too but I stopped using it!
USB-C charging still works with the Pros (driving a M3 Max), and 3 ports seems reasonable to me.
> I owned a MacBook Pro with the dreaded butterfly keyboard. It was shit.
Yea, the butterfly was utter shit. And they fucked up the touchbar by not just putting it on TOP of the existing F-keys.
But the rest of the laptop was still well machined :D
the more they deviate from the BSD core, the worse it gets.
But I can still fire up a terminal and use all of my *nix skills to operate.
I can't do that on Windows without either wrestling with PowerShell or WSL2.
I don’t think it’s at all unreasonable for an engineer using a device for 8+ hours every day to pay an additional, say, 0.5% of their income (assuming very conservatively $100,000 income after tax, $1,000 extra for a MacBook, 2 year product lifespan) for the best built laptop, best screen, and best OS.
$100,000 after tax does not seem conservative to me (at least outside the US).
$50,000 income, 4 year product lifespan?
Obviously doesn’t apply to all engineers.
> and best OS
I do networking stuff and macOS is on par with Windows - I can't live on it without running into bugs or very questionable behavior for longer than a week. Same as Windows.
What stuff is weird? I have so far had very good experiences with Apple (although not iOS yet). Almost everything I do on my Linux workstation works on Mac too. Windows though is beyond horrible and different in every way.
> I do networking stuff
Me too, but probably very different stuff. I’m doing p2p stuff over tcp and am affected mostly by sock options, buffer sizes, tcp options etc.
> Best OS
I like apple hardware, but their OS is fucking atrocious. In the year 2024 it still doesn't have a native volume mixer, or any kind of sensible window management shortcuts. Half the things on it have to be fixed with paid software. Complete joke of an OS, if it were up to me I'd stick a linux distro on top of the hardware and be happy
The OS is not a joke since it can do some stuff better than either Windows or Linux can but I completely agree that there are some serious limitations or omissions that should have been fixed.
I think they don't because they have an incentive not to: they get a cut on all the software you have to purchase on the App Store to make up for it. It might not look like a lot, but if a large portion of Mac users needs to buy a 5-10 buck app to fix the window management problems, it becomes serious money at a 15-30% cut on millions of purchases...
And this is precisely the problem with Apple today. They are not honest enough to fix or improve the stuff they sell at a very high price, both because they sell it anyway and because they put in place many incentives for themselves to not do so.
There is the running meme of the iPad calculator, but macOS could also use some care: the calculator/grapher hasn't received serious attention in decades. At the price they sell their stuff, that would seem like a given, but considering they'll make money on the apps you buy to improve that situation, they'll never do so.
After using Apple App Stores for so many years, I wish I didn't, the convenience really isn't worth the cost down the road...
And the M1 chip on mine really hurts productivity. Every time we want to update a library, we need some kind of workaround.
It's great having a chip that is so much different than what our production infrastructure uses.
This should be a temporary problem solved with time. The battery and performance gains are completely worth most workarounds required.
Not worth it at all. I rarely use battery power, so I'd rather have an intel or AMD chip with more cores and a higher clock speed at the expense of the battery. Oh, and an OS that can actually manage its windows, and customize keyboard settings, and not require an account to use the app store
Then why are you using a Macbook in the first place? There are plenty of Ryzen 7000 and Intel Ultra laptops with similar performance out there. The key benefit of a Macbook is the battery life and sane sleeping when portable.
Tell that to my employer
[dead]
Apple's hardware these days is exceptional, but the software is left wanting in comparison. macOS feels like it's been taking two steps back for every step forward for a decade now. I run macOS, Linux w/ i3, and Windows every day, and outside of aesthetics & Apple integration, macOS feels increasingly the least coherent of the 3.
The same is true of the iPad, which is a miraculous piece of hardware constrained by an impotent operating system.
This statement is completely wrong. There are millions of engineers in the world and most of them live in countries like China, India and Russia. Very few of them use MacBooks.
The vast majority of software engineers in big companies (which employ a lot more people than big tech and startups combined) who use Java and C# also have predominantly Windows laptops (as their employers can manage Windows laptops a lot more easily, have agreements with vendors like Dell to buy them at a discount, have software like AV that doesn't support macOS, etc.).
On top of that MacBooks don't have the best screens and are not the best built. Many Windows laptops have OLED screens or 4K IPS screens. There are premium Windows laptops made out of magnesium and carbon fiber.
I'm an American, so maybe the situation is different elsewhere.
Every company I've worked for during the last 12 years gives out MacBook Pros. And I've been developing using Scala / Java for the last 20 years.
Employers manage Macs just fine, this isn't 1999. There have been studies showing that Macs have lower IT maintenance costs compared to Windows.
I admit that I haven't dealt with Windows devices in a long time, maybe there are some good ones available now, but I find your statements to be beyond belief. Apple Silicon Macs have blown the doors off the competition, outperforming all but top-end Intel laptops while using a fraction of the power (and I never even hear the fans come on).
I think relatively few corporations are offering Macs to people. It's all bog-standard POS Dells, with locked-down Windows images that often do not even allow you to change the screensaver settings or the background image, in the name of "security." I'd love to be wrong about that.
Both jobs I've had, as a backend dev using Go at data-storage companies, have offered Macs. The first one, a small, badly run startup, only offered Macs. This gig, a larger company, offers Mac, Linux and Windows. I started with Linux and then switched to Mac because I was tired of stuff breaking.
I use Debian + FreeBSD on my personal PCs though.
US engineers, and those in countries with similar incomes. The rest of the world has pretty much settled on a mix of Windows and GNU/Linux desktops/laptops.
If it weren't for the OS I would've bought a MacBook instead of a Lenovo laptop.
I've set up my OS exactly as I want it. (I use arch btw ;-))
Arch works fairly well on Apple Silicon now, though Fedora is easier/recommended. Limited emulation due to the 16KB pages, and no Thunderbolt display out.
Same, but on gentoo :-p
I’m freelance so I’ve absolutely paid for my last 3 Macbooks. They’re best in class tools and assets for my business.
Arguably the best OS? For what? For browsing the web, video editing, etc.? Maybe. For development? Jesus, macOS doesn't even have native container support. All the devs I know with macOS then either get a second Linux laptop, or spend a lot of their time SSHd into a Linux server.
For dev (at least backend and devops), macOS is not that great.
I don't know what you are talking about, I'm a back end engineer, and every company I've worked for during the last 12 years gives out MacBook pros to all devs. Even the game company that used C# and Mono gave out MacBooks (and dual booted them, which of course you can't do any more; I never bothered with Windows since our servers were written in Scala).
Not all teams run tons of containers on personal computers. All our servers are running on AWS. I rarely ssh into anything.
I like the fact that OS X is based on UNIX, and not some half-assed bullshit bolted onto Windows. I still have bad memories of trying to use Cygwin 15 years ago. Apparently WSL is an improvement, but I don't care.
Mac runs all the software I need, and it has real UNIX shells.
Yeah it's funny for all the hoopla I've heard over the glory of MacOS having a REAL UNIX TERMINAL, WSL works better in practice simply because it's running an actual Linux VM and thus the support is better.
Still, I just don't think it's that burdensome to get containers running on MacOS, it's just annoying that it happens to work worse than on Windows or Linux. Ignoring the hardware, the only real advantage to MacOS development is when you're targeting Apple products with what you're developing.
"best OS" is so subjective here. I'll concede that the MacBook hardware is objectively better than any laptop I've owned. But it's a huge leap to say Mac OS is objectively better than Linux IMO.
They are perhaps only the best by a very small margin.
I am happy to not support Apple's ecosystem and use a minimally worse laptop from a different brand.
I have one and hate it with a passion. A MacBook Air bought new in the past 3 years should be able to use Teams (alone) without keeling over. Takes over a minute to launch Outlook.
My 15 year old Sony laptop can do better.
Even if Microsoft on Mac is an unmitigated dumpster fire, this is ridiculous.
I avoid using it whenever possible. If people email me, it’d better not be urgent.
I avoid using Outlook on any device, but I wouldn't complain about my Surface tablet's performance based on how poorly iTunes performs...
Meanwhile here I am, running linux distros and XFCE on everything. My hardware could be a decade old and I probably wouldn't notice.
(In fact I DO have a spare 13 year old laptop hanging around that still gets used for web browsing, mail and stuff. It is not slow.)
Indeed, I have a 15-year-old desktop computer that is still running great on Linux. I upgraded the RAM to the maximum supported by the motherboard, which is 8 GB, and it has gone through three hard drives in its life, but otherwise it is pretty much the same. As a basic web browsing computer, and for light games, it is fantastic.
It also performs pretty well for the particular brand of web development I do, which basically boils down to running VS Code, a browser, and a lot of ssh.
It's fascinating to me how people are still attached to the hardware upgrade cycle as an idea that matters, and yet for a huge chunk of people and scenarios, basically an SSD, 8GB of RAM and an Intel i5 from a decade ago could have been the end of computing history with no real loss to productivity.
I honestly look at people who use Apple or Windows with a bit of pity, because those ecosystems would just give me more stuff to worry about.
Is it an Apple silicon or Intel machine? Intel macs are crazy slow - especially since the most recent few versions of macOS. And especially since developers everywhere have upgraded to an M1 or better.
No MacBook Air from the last 3 years is Intel-based
You could certainly still buy new intel macbooks 3 years ago from Apple. Plenty of people did - particularly given a lot of software was still running through rosetta at the time.
The M1 air was only released in November 2020. With a bit of slop in the numbers, its very possible the parent poster bought an intel mac just before the M1 launched.
Yeah, it's such a shame how much the performance has been affected by recent macOS. I kept my 2019 MacBook Pro on Catalina for years because everyone else was complaining... finally upgraded directly to Sonoma and the difference in speed was night and day!
Sounds a bit like my Intel MBP, in particular after they (the company I work for) installed all the lovely bloatware/tracking crap IT thinks we need to be subjected to. Most of the day the machine runs with the fans blasting away.
Still doesn't take a minute to launch Outlook, but I understand your pain.
I keep hoping it will die, because it would be replaced with an M-series MBP and they are way, way, WAY faster than even the best Intel MBP.
That’s not an issue with MacBooks but with MS. MS has an incentive to deliver such a terrible experience on Macs.
MS has literally thousands of managers running Outlook and Teams on their company-provided ARM MacBooks daily.
> Even if Microsoft on Mac is an unmitigated dumpster fire, this is ridiculous.
It is Microsoft. I could rant all day about the dumpster fire that is the "NEW Microsoft Teams (Work or School)"
It's like the perfect shining example of how MS doesn't give a flaming fuck about their end users.
I will pile on on MS Teams. I am on a Mac and periodically have to fight it because it went offline on me for some reason and I am no longer getting messages. Slightly less annoying is when my iPhone goes to sleep and Teams on my iPhone then sets my status to "Away", even though I am actively typing on Teams on my computer.
And while my particular problems might be partially because I am on MacOS, I observe Windows-using colleagues have just as many problems joining meetings (either total refusal, no audio, or sharing issues). So I think using Teams as a measure of any computer is probably not warranted.
I suppose you like bloatware and ads in your taskbar and 49 years of patch Tuesday. Have fun with that. I’ll take Mac over any windows.
Teams is shit, and hangs and crashes on my Mac. I blame Microsoft for that.
Outlook (old) is okay on Mac. Teams is a dumpster fire on every platform.
This hasn't been true for a long time.
not machine learning devs
no, no, NO and yes.
I actually rejected a job offer when I heard I would be given a MacBook Pro.
Apple, being the most closed company these days, should be avoided as much as you can, not to mention its macOS is useless for Linux developers like me; anything else is better.
Its keyboard is dumb to me (that stupid Command/Ctrl key difference), and not even being able to mouse-select and paste is enough for me to avoid macOS at all costs.
> I actually rejected a job offer when I heard I would be given a MacBook Pro.
Probably best for you both.
> I actually rejected a job offer when I heard I would be given a MacBook Pro.
For what it's worth, I've had a good success rate at politely asking to be given an equivalent laptop I can put linux on, or provide my own device. I've never had to outright reject an offer due to being required to use a Mac. At worst I get "you'll be responsible for making our dev environment work on your setup".
I've had 50/50. These days I'm fairly okay with just taking the Macbook Pro. I did have one instance where I got one my first week and used my Dell XPS with Linux the entire 10 months I was at the place. I returned the Macbook basically unused.
Only one time did I interview with a place where I asked if I'd be given a choice what hardware/OS I could use. The response was "We use Windows". My response was, "no we do not. Either I will not be using Windows with you, or I will not be using Windows NOT with you". I didn't get an offer. I was cool with it.
I think I had similar feelings, but I kept an open mind and love my M2 Pro. Sometimes an open mind reaps rewards, friend.
I selected Mac + iOS devices when a job offered a choice, specifically to try out the option, while personally sticking with Windows and Android.
Now the performance of Mx Macs convinced me to switch, and I'll die on the hill of Android for life
> Its keyboard is dumb to me (that stupid Command/Ctrl key difference)
Literally best keyboard shortcuts out of all major OSes. I don't know what weird crab hands you need to have to comfortably use shortcuts on Windows/Linux. CMD maps PERFECTLY on my thumb.
what amazing laptop must an employer give you to not be summarily rejected?
Anything that runs Linux; even WSL2 is fine. No macOS is the key. And yes, it costs the employer about half as much as the expensive Apple devices, which can't even be upgraded; their hardware is as closed as their software.
Employers typically also care about costs like “how hard is it to provision the devices” and “how long is the useful life of this” or “can I repurpose an old machine for someone else”.
Provisioning is a place where Windows laptops win hands down, though.
Pretty much everything that goes wrong with provisioning involves going extra weird on hardware (usually for a cheap supplier) and/or pushing weird third-party "security" crapware.
macOS is clearly better for linux devs than Windows, given it is unix under-the-hood.
I don't even know what you mean by mouse-select and paste.
> "I don't even know what you mean by mouse-select and paste."
Presumably they mean linux-style text select & paste, which is done by selecting text and then clicking the middle mouse button to paste it (no explicit "copy" command).
macOS doesn't have built-in support for this, but there are some third-party scripts/apps to enable it.
For example: https://github.com/lodestone/macpaste
On Windows these days, you get WSL, which is actual Linux, kernel and all. There are still some differences with a standalone Linux system, but they are far smaller than macOS, in which not only the kernel is completely different, but the userspace also has many rather prominent differences that you will very quickly run afoul of (like different command line switches for the same commands).
Then there's Docker. Running amd64 containers on Apple silicon is slow for obvious reasons. Running arm64 containers is fast, but the actual environment you will be deploying to is almost certainly amd64, so if you're using that locally for dev & test purposes, you can get some surprises in prod. Windows, of course, will happily run amd64 natively.
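To make the mismatch concrete: on an Apple Silicon machine Docker defaults to arm64 images, and you have to opt in to amd64 emulation explicitly. A rough sketch of what I mean (alpine is just a stand-in image here):

    # Default on an Apple Silicon Mac: the arm64 variant is pulled and run
    docker run --rm alpine uname -m                          # prints aarch64

    # Explicitly request the amd64 variant (runs under emulation, slower,
    # but closer to what an x86 production host will actually execute)
    docker run --rm --platform linux/amd64 alpine uname -m   # prints x86_64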
> "userspace also has many rather prominent differences ... (like different command line switches for the same commands)."
Very quickly remedied by installing the GNU versions of those commands, ie: "brew install coreutils findutils" (etc)
Then you'll have exactly the same command line switches as on Linux.
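One caveat, assuming a default Homebrew setup: those packages install their commands with a "g" prefix (gls, gfind, ...) unless you put the "gnubin" directories in front of your PATH, roughly like this:

    # ~/.zshrc -- prefer the GNU tools over the BSD ones that ship with macOS
    export PATH="$(brew --prefix)/opt/coreutils/libexec/gnubin:$PATH"
    export PATH="$(brew --prefix)/opt/findutils/libexec/gnubin:$PATH"

After that, ls, find and friends from those packages take the same switches they do on Linux.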
> the actual environment you will be deploying to is almost certainly amd64
that’s up to your team of course, but graviton is generally cheaper than x86 instances nowadays and afaik the same is true on google and the other clouds.
Arm is an ISA, not a family of processors. You should expect Apple chips and Graviton to be wildly different, and to perform completely differently in the same scenario. In fact, most Arm CPUs also have specific extensions that are not found in other manufacturers' parts. So yes, while both recognize a base set of instructions, that's about it; expect everything else to be different. I know, amd64 is also technically an ISA, but there you have 2 major manufacturers with very similar and predictable performance characteristics. And even then, sometimes something on AMD behaves quite differently from Intel.
For most devs, doing crud stuff or writing high-level scripting languages, this isn't really a problem. For some devs, working on time-sensitive problems or with strict baseline performance requirements, this is important. For devs developing device drivers, emulation can only get you so far.
What are you responding to here?
No, I said you won’t always be deploying on amd64. Because arm64 is now the cheapest option and generally faster than the Sandy Bridge vCPU unit that amd64 instances are indexed against (and really, constrained to, intentionally, by AWS).
I never said anything about graviton not being arm64.
It's not about price, it's about compatibility. Just because software compiles for a different ISA doesn't mean it behaves the same way. But if that isn't obvious to you, good for you.
On most Linux environments: text you highlight with the mouse (or highlight by double/triple clicking) can be "pasted" by middle-clicking.
And it's a separate clipboard from Ctrl+C/right-click-and-copy. The number of times I miss that on non-Linux...
Personally, I use tmux on both Linux and macOS to get multiple clipboards and the mouse behaviour I’m used to.
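For anyone curious, a minimal sketch of the kind of thing I mean in ~/.tmux.conf (not a complete config, just the mouse and copy-mode bits):

    set -g mouse on                 # select and scroll with the mouse
    setw -g mode-keys vi            # vi-style keys in copy mode
    # 'y' copies the selection into tmux's own paste buffer; paste with prefix + ]
    bind -T copy-mode-vi y send-keys -X copy-selection-and-cancel

Because the buffer lives inside tmux, it behaves the same regardless of the host OS clipboard.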
[dead]
Engineers loving tools is peak HN :).
No, no, no, yes.
If we are honest vanity signaling is a large part of it. Basically the Gucci bag equivalent for techies.
Honestly not. My tests run WAY faster on Apple Silicon, that's all I care about.
Not being contrarian, but what are you comparing?
M* has caused nothing but trouble for most mac user engineers I know (read: most engineers I know) who upgraded. Now not only are they building software for a different OS, they're building for a different architecture! They do all of their important compute in Docker, wasting CPU cycles and memory on the VM. All for what: a nice case? nice UI (that pesters you to try Safari)?
It looks like Apple's silicon and software is really good for those doing audio/video. Why people like it for dev is mostly a mystery to me. Though I know a few people who don't really like it but are just intimidated by Linux or just can't handle the small UX differences.
I'm an engineer that has both an apple silicon laptop (mbp, m2) and a linux laptop (arch, thinkpad x1 yoga.) I choose the mac every day of the week and it's not even close. I'm sure it's not great for specific engineering disciplines, but for me (web, rails, sre) it really can't be beat.
The UX differences are absolutely massive. Even after daily-driving that thinkpad for months, Gnome always felt kinda not quite finished. Maybe KDE is better, but it didn't have Wayland support when I was setting that machine up, which made it a non-starter.
The real killer though is battery life. I can work literally all day unplugged on the mbp and finish up with 40-50% remaining. When i'm traveling these days, i don't even bring a power cable with me during the day. The thinkpad, despite my best efforts with powertop, the most aggressive frequency scaling i could get, and a bunch of other little tricks, lasts 2 hours.
There are niceties about Linux too. Package management is better and the docker experience is _way_ better. Overall though, i'd take the apple silicon macbook 10 times out of 10.
Battery life followed by heat and fan noise have been my sticking points with non-mac laptops.
My first-gen ThinkPad X1 Nano would be an excellent laptop, if it weren’t for the terrible battery life even in power save mode (which, as an aside, slows it down a lot) and its need to spin up a fan to do something as trivial as driving a rather pedestrian 2560x1440 60Hz display.
It feels almost like priorities are totally upside down for x86 laptop manufacturers. I totally understand and appreciate that there are performance oriented laptops that aren’t supposed to be good with battery life, but there’s no good reason for there being so few ultraportable and midrange x86 laptops that have good battery life and won’t fry your lap or sound like a jet taking off when pushed a little. It’s an endless sea of mediocrity.
> The thinkpad, […], lasts 2 hours.
This echoes my experience with anything that needs power management. Not just that the battery life is worse, but that it degrades quickly: in two years it’s barely usable. I’ve seen this with non-Apple phones and laptops. The iPhone, on the other hand, is so good these days you don’t need to upgrade until EOL at ~6 years (and even if you do, the battery is not more expensive than any other proprietary battery). My last MacBook, from 2011, failed a couple of years ago only because of a Radeon GPU inside with a known hardware error.
> There are niceties about Linux too.
Yes! If you haven’t tried in years, the Linux desktop experience is awesome (at least close enough) for me – a dev who CAN configure stuff if I need to but find it excruciatingly menial if it isn't related to my core work. It’s really an improvement from a decade ago.
I'd like to offer a counterpoint, I have an old'ish T480s which runs linuxmint, several lxd containers for traefik, golang, python, postgres and sqlserver (so not even dockerized, but full VMs running these services), and I can go the whole morning (~4-5 hours).
I think the culprit is more likely the power hungry intel CPU in your yoga?
Going on a slight tangent; I've tried but do not like the Mac keyboards, they feel very shallow to me, hence why I'm still using my old T480s. The newer ThinkPad laptop keyboards all seem to be going that way too (getting thinner), much to my dismay. Perhaps a P14s is my next purchase, despite its bulk.
Anybody with a framework 13 want to comment on their keyboard?
I really like the keyboards on my frameworks. I have both the 13 and the new 16, and they are pretty good. Not as good as the old T4*0s I'm afraid, but certainly usable.
Interesting. I do similar (lots of Rails) but have pretty much the opposite experience (other than battery life - Mac definitely wins there). Though I use i3/Sway more than Gnome. The performance of running our huge monolith locally is much better for Linux users than Mac users where I work.
I used a Mac for awhile back in 2015 but it never really stood out to me UX-wise, even compared to Gnome. All I really need to do is open a few windows and then switch between them. In i3 or Sway, opening and switching between windows is very fast and I never have to drag stuff around.
This is going to change once Arm on Linux becomes a thing with Qualcomm's new jazz. I am mostly tethered to a dock with multiple screens. I have been driving Ubuntu now for over 4 years full time for work.
>The UX differences are absolutely massive.
Examples?
In my experience as a backend services Go developer (and a bit of Scala) the switch to arm has been mostly seamless. There was a little config at the beginning to pull dual-image docker images (x64 and arm) but that was a one time configuration. Otherwise I'm still targeting Linux/x64 with Go builds and Scala runs on the JVM so it's supported everywhere anyway; they both worked out of the box.
My builds are faster, laptop stays cooler, and battery lasts longer. I love it.
If I was building desktop apps I assume it would be a less pleasant experience like you mention.
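For reference, the cross-targeting part really is just environment variables with the Go toolchain; a rough sketch (the ./cmd/server path and output names are made up):

    # From an arm64 Mac, build a Linux/amd64 binary for the usual x86 fleet
    CGO_ENABLED=0 GOOS=linux GOARCH=amd64 go build -o bin/server-amd64 ./cmd/server

    # Same source, arm64 Linux (e.g. Graviton instances)
    CGO_ENABLED=0 GOOS=linux GOARCH=arm64 go build -o bin/server-arm64 ./cmd/server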
The pain for me has been in the VM scene, as VirtualBox disappeared from the ecosystem with the switch to ARM.
Interestingly enough, the trend I am seeing is all the MacBook engineers moving back to native development environments. Basically, no longer using Docker. And just as expected, developers are getting bad with Docker and are finding it harder to use. They are getting more and more reliant on devops help, or they lean on the team member who is on Linux to handle all of that stuff.
We were on a really great path for a while there, where development was getting closer to the ideal of having dev environments more closely resemble production, and of having developers understand the operations tools. Now we're cruising firmly in the opposite direction because of this Apple switch to ARM.
Mainly it wouldn't bother me so much if people would recognize that they are rationalizing because they like the computers, but they don't. They just try to defend logically a decision they made emotionally. I do it too, every human does, but a little recognition would be nice.
It's not even a problem with MacBooks as such. They are still excellent consumer devices (non-casual gaming aside). It's this weird positioning of them as the ultimate dev laptop that causes so many problems, IMO.
Why would excellent machine be blamed for shitty software?
Because machines are tools meant to perform tasks, and part of that is being interoperable with other tools and de facto standards in the relevant field. For dev work, today, MacBook is not good at it.
Remember, though, that the binaries deployed in production environments are not being built locally on individual developer machines, but rather in the cloud, as reproducible builds securely deployed from the cloud to the cloud.
Modern language tooling (Go, Rust et al) allows one to build and test on any architecture, and the native macOS virtualization (https://developer.apple.com/documentation/virtualization) provides remarkably better performance compared to Docker (which is a better explanation for its fading from daily use).
Your "trend" may, in fact, not actually reflect the reality of how cloud development works at scale.
And I don't know a single macOS developer that "lean(s) on the team member who is on Linux" to leverage tools that are already present on their local machine. My own development environments are IDENTICAL across all three major platforms.
Virtualization and Docker are orthogonal technologies. The reason you use Docker, especially in dev, is to have the exact same system libraries, dependencies, and settings on each build. The reason you use virtualization is to access hardware and kernel features that are not present on your hardware or native OS.
If you deploy on docker (or Kubernetes) on Linux in production, then ideally you should be using docker on your local system as well. Which, for Windows or MacOS users, requires a Linux VM as well.
It seems that you're trying to "educate" me on how containers and virtualization work, when in fact I've been doing this for a while, on macOS, Linux and Windows (itself having its own Hyper-V pitfalls).
I know you mean well, though.
There is no Docker on macOS without a hypervisor layer - period - and a VM, though there are multiple possible container runtimes not named Docker that are suitable for devops-y local development deployments (which will always, of course, be constrained in comparison to the scale of lab / staging / production environments). Some of these can better leverage the Rosetta 2 translation layer that Apple provides, than others.
I'm sorry that I came up as patronizing, I was more so trying to explain my confusion and thought process rather than to teach you about virtualization and containers.
Specifically what confused me in your comment was that you were saying Docker on Mac was superseded by their new native virtualization, which just doesn't make sense to me, for the reasons I was bringing up. I still don't understand what you were trying to say; replacing docker with podman or containerd or something else still doesn't have anything to do with virtualization or Rosetta, or at least I don't see the connection.
I should also say that I don't think anyone really means specifically docker when they talk about it, they probably mean containerization + image repos in general.
I don’t know a single engineer who had issues with M chips, and most engineers I know (me included) benefited considerably from the performance gains, so perhaps your niche isn’t that universal?
My niche is Ruby on Rails web dev, which is definitely not universal, but not all that narrow either!
You must have an unusual setup because, between Rosetta and rosetta in Virtualization.framework VMs (configurable in Docker Desktop or Rancher Desktop), I’ve never had issues running intel binaries on my Mac
I’m doing Ruby on Rails dev too. I don’t notice a huge difference between macOS and Linux for how I work.
There’s quirks to either OS.
E.g. on Gnome it drives me mad that it won’t focus recently launched apps.
On macOS it annoys me that I have to install a 3rd-party util to move windows around.
Meh, you just adapt after a while.
what's wrong w/ Rails on M chips? I don't recall having had much trouble with it (except w/ nokogiri bindings right when the M1 was first available, but that's a given for any new release of OSX)
We have to cross-compile anyway because now we're deploying to arm64 Linux (AWS Graviton) in addition to x86 Linux.
So even if all developers of your team are using Linux, unless you want to waste money by ignoring arm64 instances on cloud computing, you'll have to setup cross compilation.
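The common way to handle that in container land is a multi-arch image: build both variants once and push them under one tag with a manifest, roughly like this with docker buildx (the registry and image name are placeholders):

    # One-time: create a builder that can build/emulate both platforms
    docker buildx create --use --name multiarch

    # Build amd64 + arm64 variants and push them under a single tag
    docker buildx build \
      --platform linux/amd64,linux/arm64 \
      -t registry.example.com/myapp:latest \
      --push .

Each node then pulls the variant matching its own architecture automatically.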
1) Macs are by far the best hardware, and even performance running Intel code is faster than running it on the previous Intel Macs: https://discourse.slicer.org/t/hardware-is-apple-m1-much-fas... 2) They should use Safari to keep power usage low and browser diversity high.
It's basically required for iOS development. Working around it is extremely convoluted and annoying.
I forgot to mention that as an obvious exception. Of course developing for Apple is best on Apple hardware.
I strongly suggest putting in the time to learn how to install and maintain a Linux laptop ... Ubuntu 24.04 is a great engineering platform.
It is, provided that the hardware vendor has reasonably decent support for power management, and you're willing to haul around an AC adapter if not. In general, I really like AMD hardware with built-in graphics for this, or alternately, Intel Tiger Lake-U based hardware.
Asahi Linux is shockingly great on Apple Silicon hardware, though.
I disagree.
Apple is selling hardware and scaling AI by utilizing it is simply a smart move.
Instead of building huge GPU clusters and having to deal with NVIDIA for GPUs (Apple kicked NVIDIA out years ago because of disagreements), Apple is building mainly on existing hardware.
In other terms, this is utilizing CPU power.
On the other hand, this helps their marketing keep price points high, since Apple is now going to differentiate hardware prices by CPU power, with AI functionality correlating with CPU power. This is also consistent with Apple stopping the MHz comparisons years ago.
Did you reply to the right comment? Feels like we’re talking about different things altogether.
What AI is Apple scaling?
Seen MLX folks post on X about nice results running local LLMs. https://github.com/ml-explore/mlx
Also, Siri, and consider: you’re scaling AI on apple’s hardware, too, you can develop your own local custom AI on it, there’s more memory available for linear algebra in a maxed out MBP than the biggest GPUs you can buy.
They scale the VRAM capacity with unified memory and that plus a ton of software is enough to make the Apple stuff plenty competitive with the corresponding NVIDIA stuff for the specific task of running big AI models locally.
> there’s more memory available for linear algebra in a maxed out MBP than the biggest GPUs you can buy.
But this hardly applies to 95% (if not more) of all people running Apple's hardware; the fastest CPU/GPU isn't worth much if you can't fit any even marginally useful LLM on the 8GB (or less on iPhones/iPads) of memory that your device has.
>Even in the higher end products like the MacBooks you see a lot of professionals (engineers included) who choose it because of its price-performance-value, and who don’t give a shit about luxury.
Most CS professionals who write code have no idea what it takes to build a desktop, so the hardware that they chose is pretty much irrelevant, because they aren't specifically choosing for hardware. The reason Apple gets bought, by tech people and everyone else, is mostly the ecosystem. The truth is, nobody really cares that much about actual specs as long as it's good enough to do basic stuff, and when you are indifferent to the actual difference but all your friends are in the ecosystem, the choice is obvious.
You can easily see this yourself: ask these "professionals" about the details of the Apple Neural Engine, and there's a very high chance that they will repeat some marketing material, while failing to mention that Apple does not publish any real docs for the ANE, that you have to sign your code to run on the ANE, and that you basically have to use Core ML to utilize it. I.e. if they really cared about inference, all of them would be buying laptops with discrete 4090s for almost the same price.
Meanwhile, if you look at people who came from EE/ECE (who, by the way, are on average far better coders than people with a CS background, based on my 500+ interviews in the industry across several sectors), you see a much larger skew towards Android/custom-built desktops/Windows laptops running Linux. If you lived and breathed Linux and low-level OS work, you tend to appreciate all the power and customization it gives you, because you don't have to go learn how to do things.
Coming from both environments, I'd be wary of making some of these assertions, especially when you consider that any ecosystem that optimizes software and hardware together (from embedded devices all the way to general-purpose computing machines) is generally going to perform well, given the appropriate engineering focus. This applies regardless of (RT)OS / hardware choice, i.e., it's simply common sense.
The signing of binaries is a part of adult developer life, and is certainly required for the platforms you mention as well.
Unquestionably, battery life on 4090-based laptops sucks on a good day, and if you're working long hours, the last thing you want to have to do is park yourself next to your 350W adapter just to get basic work done.
> especially when you consider that any ecosystem that optimizes software and hardware together (from embedded devices all the way to general-purpose computing machines) is generally going to perform well, given the appropriate engineering focus.
Very much not true. Not to make this personal, but this is exactly what I'm talking about: Apple fans not understanding hardware.
Linux has been through the wringer of fighting its way to general use, thanks to its open source nature and constant development. So in terms of working well, it has been optimized for hardware WAY further than Apple, which is why you find it on servers, personal desktops, phones, portable gaming devices, and even STM32 Cortex BLDC control boards, all of which run different hardware.
Apple doesn't optimize for general use, it optimizes for a specific business case. In the case of Apple Silicon, it was purely battery life, which brings more people into the ecosystem. Single-core performance is on par with all the other chips, because the instruction set doesn't actually matter (https://chipsandcheese.com/2021/07/13/arm-or-x86-isa-doesnt-...), multi-core is behind, macOS software is still a pile of junk (Rosetta still isn't good across the board), the computers are not repairable, you have no privacy since Apple collects a shitload of telemetry for themselves, etc. and so on.
And Apple has no incentive to make any of this better: prior to Apple Silicon, people were still buying Intel Macs with worse specs and performance for the same price, all for the ecosystem and vanity. And not only was macOS still terrible (and much slower), you also had hardware failures where plugging in the wrong USB-C hub could blow the chip and brick your Mac, butterfly keyboards failing, and questionable decisions like virtual Esc keys.
>The signing of binaries is a part of adult developer life,
...for professional use, and the private key holder should be the person who wrote that software. I hope you understand how ridiculous it is to ask a developer to sign code using the manufacturers key to allow them to run that code on a machine that they own.
>Unquestionably, battery life on 4090-based laptops sucks on a good day,
Well yea, but you are not buying that laptop for battery life. Also, with Ryzen cpus and 4090s, most get like 6-8 hours depending on use due to Nvidia Prime, which is pretty good for travel, especially if you have a backpack with a charging brick.
If you want portability, there are plenty of lighter weight option like Lenovo Yoga which can get 11-12 hours of battery life for things like web browsing.
Macbooks are not bang-for-buck. Most engineers I know buy it because it's like Windows but with Unix tools built-in.
I would be interested if there exists a single better value machine in $ per hour than my partners 2012 MacBook Air, which still goes
Any decent laptop from the same era. My parents are using both HP ProBooks and Lenovo Thinkpads from that era currently and they are working perfectly and maintenance costs are lower than the same era macbooks...
I own a MacBook Air, I won't be buying another purely because the moment I need to upgrade anything or repair anything it's effectively ewaste.
I haven't found any good proxy which works well with Cisco VPN software. Charles and Proxyman work intermittently at best and require disconnecting from the VPN and various such dances.
Fiddler on Windows works flawlessly.
> Somewhat true but things are changing. While there are plenty of “luxury” Apple devices like Vision Pro or fully decked out MacBooks for web browsing we no longer live in a world where tech are just lifestyle gadgets.
I notice your use of the weasel word "just".
We undoubtedly live in a world where Apple products are sold as lifestyle gadgets. Arguably it's more true today than it ever was. It's also a world where Apple's range of Veblen goods managed to gain footing in social circles to an extent that we have kids being bullied for owning Android phones.
Apple's lifestyle angle is becoming especially relevant because they can no longer claim they sell high-end hardware, as the difference in specs between Apple's hardware and product ranges from other OEMs is no longer noticeable. Apple's laughable insistence on shipping laptops with 8GB of RAM is a good example.
> Even in the higher end products like the MacBooks you see a lot of professionals (engineers included) who choose it because of its price-performance-value, and who don’t give a shit about luxury.
I don't think so, and that contrasts with my personal experience. All my previous roles offered a mix of MacBooks and Windows laptops, and MacBooks were picked by new arrivals because they were seen as perks, while the particular Windows models on offer did not look as impressive, even though they out-specced Apple's offering (mid-range HP and Dell). In fact, in a recent employee review the main feedback was that the MacBook Pro line was under-specced, because at best it shipped with only 16GB of RAM while the less impressive HP ones already came with 32GB. In previous years, they called for the replacement of the MacBook line due to the rate of keyboard malfunctions. Meaning, engineers were purposely picking the underperforming option for non-technical reasons.
I bought my first Apple product roughly 11 years ago explicitly because it had the best accessibility support at the time (and that is still true). While I realize you only see your slice of the world, I really cringe when I see the weasel-word "lifestyle". This "Apple is for the rich kids"-fairytale is getting really really old.
Apparently you’ve never used Apple Silicon. There’s no PC equivalent in terms of specs.
Also, I think you’re misunderstanding what a Veblen good is and the difference between “premium” and “luxury.” Apple does not create luxury or “Veblen” goods like for example, LVMH.
An easy way to discern the difference between premium and luxury — does the company advertise the product’s features or price?
For example, a Chanel handbag is almost entirely divorced from its utility as a handbag. Chanel doesn’t advertise features or pricing, because it’s not about the product’s value or utility, it’s what it says about your personal wealth that you bought it. That’s a Veblen good.
Apple heavily advertises features and pricing. Because they sell premium products that are not divorced from their utility or value.
price-performance is not a thing for a vast majority of users. Sure I'd like a $40k car but I can only afford a $10k car. It's not nice but it gets me from a to b on my min-wage salary. Similarly, I know plenty of friends and family. They can either get 4 macs for $1000 each (mom, dad, sister, brother) so $4k. Or they can get 4 windows PCs for $250 so $1k total.
The cheap Windows PCs suck just like a cheap car sucks (ok, they suck more), but they still get the job done. You can still browse the web, read your email, watch a youtube video, post a youtube video, write a blog, etc.. My dad got some HP celeron. It took 4 minutes to boot. It still ran though and he paid probably $300 for it vs $999 for a mac. He didn't have $999.
I’m not saying one or the other is better for your family members. But MacBooks last very long. We'll see about the M series but for myself for instance I got the M1 air without fans, which has the benefit of no moving pieces or air inlets, so even better. My last one, a MBP from 2011 lasted pretty much 10 years. OS updates are 8-10y.
> The cheap Windows PCs suck […], but they still get the job done
For desktop, totally. Although I would still wipe it with Ubuntu or so because Windows is so horrible these days even my mom is having a shit time with only browsing and video calls.
A random laptop however is a different story. Except for premium brands (closer to Apple prices) they tend to have garbage battery life, infuriating track pad, massive thermal issues, and preloaded with bloatware. Apple was always better here, but now with the lower power/heat of the ARM chips, they got soooo much better overnight.
> A random laptop however is a different story. Except for premium brands (closer to Apple prices) they tend to have garbage battery life, infuriating track pad, massive thermal issues, and preloaded with bloatware. Apple was always better here, but now with the lower power/heat of the ARM chips, they got soooo much better overnight.
To the person with no budget, all that doesn't matter. They'll still get the $250 laptop and put up with the garbage battery life (find a power outlet), infuriating trackpad (buy an external mouse for $10), bloatware (most users don't know about this and just put up with it), etc.
I agree Apple is better. But if your budget is $250 and not $1k then you get what you can get for $250 and continue to feed your kids and pay your rent.
But also you don't have to buy new. If I had $250, an ancient MacBook might be better than a newer low-end windows laptop. Though for my purposes I'd probably get an oldish Chromebook and root it.
Most of Apple’s money comes from iPhones.
You can get a laptop with a much bigger screen and a keyboard for as little as $100 to $300, and it will be much, much easier to get work done on than an Apple phone. So I think Apple is still very much a luxury product.
Spending your life on a phone is still a lifestyle "choice".
[flagged]
Countering a lazy reference with some weird racist stereotype was the best you could do?
It's not a stereotype, no one programs on phones here.
True, should have just said nonsense
Clumsily phrased. What I meant is that iPhones or similar priced smartphones are affordable and common for say middle class in countries with similar purchase power to Eastern European countries. You’d have to go to poorer countries like Vietnam or Indonesia for iPhones to be “out of reach”, given the immense value it provides.
Heck now I see even Vietnam iPhone is #1 vendor with a 28% market penetration according to statcounter. That’s more than I thought, even though I was just there…
Speaking of India, they’re at 4% there. That’s closer to being luxury.
I think US is their main market, though. The rest of the world prefers cheaper better phones and doesn't mind using WhatsApp for messaging, instead of iMessage.
As a single market, US is probably biggest. I’m seeing numbers that say that the “Americas” is a bit less than half of global revenue, and that would include Canada and all of South and Latin America. So the rest of the world is of course very important to Apple, at least financially.
> doesn't mind using WhatsApp for messaging
Well WhatsApp was super early and way ahead of any competition, and the countries where it penetrated had no reason to leave, so it’s not exactly like they settle for less. It has been a consistently great service (in the category of proprietary messaging apps), even after Zuck took over.
It's not about price-performance value at all. Mac is still the most expensive performance. And Apple is only particularly popular in the US. Android phones dominate most other markets, particularly poor markets.
Apple is popular in the US because a) luxury brands hold sway b) they goad customers into bullying non-customers (blue/green chats) and c) they limit features and customizability in favor of simpler interfaces.
It's popular with developers because a) performance is valuable even at Apple's steep cost and b) it's Unix-based, unlike Windows, so it shares more with the Linux systems most engineers are targeting.
I have never been an Apple fanboy. Until 2022, I was on Android phones. Work issued either ThinkPad or XPS variants. However, I have owned Apple laptops since 2004, starting from the Panther era. I sincerely believe that Apple provides the best combination of features and performance for the price in laptops.
Here I feel that the I-hate-Apple crowd is just stuck with this notion of a luxury, overpriced brand when that is clearly not the case. Apple has superior hardware at better price points. Last time I was shopping for a laptop, I could get similar features only at a 30%-40% price premium from other brands.
I am typing this on an apple M2 air and try finding similar performance under 2000 USD in other brands. The responsiveness, the (mostly) sane defaults and superior rendering and fonts make it worth it. The OS does not matter so much as it used to do in 2004 and the fact that I have a unix terminal in 2024 is just incidental. I have turned off auto updates and I do not use much of phone integration apart from taking backups and photo copying.
I switched to an iPhone in 2022 from a $200 Samsung handset. Here, I would say that not everyone needs an iPhone. My old phone used to do all the tricks I need on this one. However, the camera is really good and the photos are really great. If I buy an iPhone next time, it would be just for the photos it takes.
> > In case it is not abundantly clear by now: Apple's AI strategy is to put inference (and longer term even learning) on edge devices. This is completely coherent with their privacy-first strategy (which would be at odds with sending data up to the cloud for processing).
> Their primary business goal is to sell hardware.
There is no contradiction here. No need for luxury. Efficient hardware scales, Moore's law has just been rewritten, not defeated.
Power efficiency combined with shared and extremely fast RAM is still a formula for success, as long as they are able to deliver.
By the way, M-series MacBooks have crossed into bargain territory by now compared to WinTel in some specific (but large) niches, e.g. the M2 Air.
They are still technically superior in power efficiency and still competitive in performance in many common uses, be it traditional media decoding and processing, GPU-heavy tasks (including AI), single-core performance...
By the way, this includes web technologies / JS.
This is it. An M-series Air is an incredible machine for most people, people who likely won’t ever write a line of JS or use a GPU. Email, banking, YouTube, etc. on a device with incredible battery and hardware that will likely be useful for a decade is perfect. The average user hasn’t even heard of HN.
It's great for power users too. Most developers really enjoy the experience of writing code on Macs. You get a Unix based OS that's just far more usable and polished than a Linux laptop.
If you're into AI, there's objectively literally no other laptop on the planet that is competitive with the GPU memory available on an MBP.
It's an amazing machine for engineers too.
Are newer airs good enough for development?
Depends on your workload. RAM and passive cooling are the most likely issues, but AFAIK an M2/M3 with 16GiB still performs a lot better than a similarly priced x64 laptop. Active cooling doesn't mean no throttling either.
If you don't explicitly want a laptop, a 32GB M2 Pro Mac Mini would be a good choice I think.
Personally i only have used MBPs so far.
But the M-series Air are not remotely comparable to the old Intel Airs, that's for sure :)
It doesn't need to stay true forever.
The alternative is Google / Android devices and OpenAI wrapper apps, both of which usually offer a half baked UI, poor privacy practices, and a completely broken UX when the internet connection isn't perfect.
Pair this with the completely subpar Android apps, Google dropping support for an app about once a month, and suddenly I'm okay with the lesser of two evils.
I know they aren't running a charity, I even hypothesized that Apple just can't build good services so they pivoted to focusing on this fake "privacy" angle. In the end, iPhones are likely going to be better for edge AI than whatever is out there, so I'm looking forward to this.
> The alternative is Google / Android devices
No, the alternative is Android devices with everything except firmware built from source and signed by myself. And at the same time, being secure, too.
You just can't have this on Apple devices. On Android side choices are limited too, I don't like Google and especially their disastrous hardware design, but their Pixel line is the most approachable one able to do all these.
Heck, you can't even build your own app for your own iPhone without buying additional hardware (a Mac; this is not a software issue, it's a legal issue: the iOS SDK is licensed to you on the condition of using it on Apple hardware only) and a yearly subscription. How is this acceptable at all?
> No, the alternative is Android devices with everything except firmware built from source and signed by myself
Normal users will not do this. Just because many of the people here can build and sign a custom Android build doesn't mean that is a viable commercial alternative. It is great that is an option for those of us who can do it, but don't present it as a viable alternative to the iOS/Google ecosystems. The fraction of people who can and will be willing to do this is really small. And even if you can do it, how many people will want to maintain their custom built OSes?
> Normal users will not do this.
Unfortunately a lot of the "freedom" crowd think that unless you want to be an 80s sysadmin you don't deserve security or privacy. Or computers.
the main reason the masses don't have privacy and security-centred systems is that they don't demand them and they will trade it away for a twopence or for the slightest increment in convenience
a maxim that seems to hold true at every level of computing is that users will not care about security unless forced into caring
with privacy they may care more, but they are easily conditioned to assume it's there or that nothing can be realistically be done about losing it
[dead]
I, an engineer, am not doing this myself, too. There is a middle ground though: just use a privacy-oriented Android build, like DivestOS. [1]
There are a couple caveats:
1. It is still a bit tricky for a non-technical person to install. Should not be a problem if they know somebody who can help, though. There's been some progress making the process more user friendly recently (e.g. WebUSB-based GrapheneOS installer).
2. There are some papercuts if you don't install Google services on your phone. microG [2] helps with most but some still remain. My main concern with this setup is that I can't use Google Pay this way, but having to bring my card with me every time seems like an acceptable trade off to me.
[1]: https://divestos.org/
[2]: https://microg.org/
The biggest problem with these kinds of setups is usually the banking apps which refuse to run if it's not "safe".
WebGPU and many other features on iOS are unimplemented or implemented in half-assed or downright broken ways.
These features work on all the modern desktop browsers and on Android tho!
> WebGPU and many other features
WebGPU isn't standardized yet. Hell, most of the features people complain about aren't part of any standard, but for some reason there's this sense that if it's in Chrome, it's standard - as if Google dictates standards.
> but for some reason there's this sense that if it's in Chrome, it's standard - as if Google dictates standards.
Realistically, given the market share of Chrome and Chromium based browsers, they kind of do.
I didn't like it when Microsoft dominated browsers, and I'm no happier now. I've stopped using Chrome.
Just curious – what are you using now?
I’ve been using Firefox since the Quantum version came out. It feels slightly slower than Chrome, but the difference is negligible to me. Otherwise I can't tell a difference (except some heavy web-based Office-like solutions screaming 'Your browser is not supported!' but actually working fine).
Meanwhile, Apple has historically dictated that Google can't publish Chrome for iOS, only a reskinned Safari. People in glass-walled gardens shouldn't throw stones.
Firefox has an implementation of WebGPU, why is Safari missing in action?
> How is this acceptable at all?
Because as you described, the only alternatives that exist are terrible experiences for basically everyone, so people are happy to pay to license a solution that solves their problems with minimal fuss.
Any number of people could respond to “use Android devices with everything except firmware built from source and signed by myself” with the same question.
The yearly subscription is for publishing your app on Apple’s store, and it definitely helps keep some garbage out. Running your own app on your own device is basically solved with free third-party solutions now (see AltStore and, more recently, a newer method I can’t recall atm).
Notice that parent never talked about publishing apps, just building and running apps on their own device. "Publishing on AltStore" (or permanently running the app on your own device in any other way) still requires a $100/year subscription as far as I'm aware.
Those are only available in the EU, and Apple has been huffing and puffing even here.
> No, the alternative is Android devices with everything except firmware built from source and signed by myself
I wouldn't bet on this long term, since it fully relies on Google hardware, and Google's long-term strategy is to remove your freedom piece by piece and cash in on it, not to support it.
The real alternative is GNU/Linux phones, Librem 5 and Pinephone, without any ties to greedy, anti-freedom corporations.
> No, the alternative is Android devices with everything except firmware built from source and signed by myself. And at the same time, being secure, too.
There are people who don't know how to use a file explorer; a new generation grows up in a world of iPhones without ever seeing a file system. Any other bright ideas?
> Heck, you can't even build your own app for your own iPhone without buying another hardware (a Mac, this is not a software issue, this is a legal issue, iOS SDK is licensed to you on the condition of using on Apple hardware only) and a yearly subscription. How is this acceptable at all?
Because they set the terms of use of the SDK? You're not required to use it. You aren't required to develop for iOS. Just because Google gives it all away for free doesn't mean Apple has to.
> You aren't required to develop for iOS.
Sure, and as a SWE I'm not going to buy a computer unable to run my own code. A smartphone is an ergonomic portable computer, so I say no to the iPhone, and I'd like to point this out to others who haven't thought deeply about it.
> You aren't required to develop for iOS
Do you have a legal right to write software or run your own software for hardware you bought?
Because it’s very easy to take away a right by erecting artificial barriers, just like how you could discriminate by race at work but pretend you are doing something else.
> Do you have a legal right to write software or run your own software for hardware you bought?
I've never heard of such a thing. Ideally I'd like that, but I don't have such freedoms with the computers in my cars, for example, or the one that operates my furnace, or even for certain parts of my PC.
So you bought "a thing" but you can't control what it does, how it does it, and you don't get to decide what data it collects or who can see that data.
You aren't allowed to repair the "thing" because the software can detect you changed something and will refuse to boot. And whenever it suits the manufacturer, they will decide when the "thing" is declared out of support and stops functioning.
I would say you are not an owner then; you (and I) are just suckers who are paying for the party. Maybe it's a lease. But then we also pay when it breaks, so it's more of a digital feudalism.
> Do you have a legal right to write software or run your own software for hardware you bought?
No, obviously not. Do you have a right to run a custom OS on your PS5? Do you have a right to run a custom application on your cable set-top box? Etc. Such a right obviously doesn’t exist and most people generally are somewhere between “don’t care” and actively rejecting it for various reasons (hacking in games, content DRM, etc).
It’s fine if you think there should be, but it continues this weird trend of using apple as a foil for complaining about random other issues that other vendors tend to be just as bad or oftentimes even worse about, simply because they’re a large company with a large group of anti-fans/haters who will readily nod along.
Remember when the complaint was that the pelican case of factory OEM tools you could rent (or buy) to install your factory replacement screen was too big and bulky, meaning it was really just a plot to sabotage right to repair?
https://www.theverge.com/2022/5/21/23079058/apple-self-servi...
> Remember when the complaint was that the pelican case of factory OEM tools you could rent (or buy) to install your factory replacement screen was too big and bulky, meaning it was really just a plot to sabotage right to repair?
Yes, I do. That was and continues to be a valid complaint, among all other anti-repair schemes Apple have come up with over the years. DRM for parts, complete unavailability of some commonly repaired parts, deliberate kneecapping of "Apple authorized service providers", leveraging the US customs to seize shipments of legitimate and/or unlabeled replacement parts as "counterfeits", gaslighting by official representatives on Apple's own forums about data recovery, sabotaging right to repair laws, and even denial of design issues[1] to weasel out of warranty repair just to name a few.
All with the simple anti-competitive goal of making third party repair (both authorized and independent) a less attractive option due to artificially increased prices, timelines to repair, or scaremongering about privacy.
https://arstechnica.com/gadgets/2022/12/weakened-right-to-re...
https://www.pcgamer.com/ifixit-says-apples-iphone-14-is-lite...
[1] Butterfly keyboards, display cables that were too short and failed over time
> Yes, I do. That was and continues to be a valid complaint,
No, it doesn’t - because you can simply not use the tools if you don’t want. You can just order a $2 spudger off Amazon if you want, you don’t need the tools at all.
It continues to be a completely invalid complaint that shows just how bad-faith the discussion about apple has become - it literally costs you nothing to not use the tools if you want, there is no downside to having apple make them available to people, and yet you guys still find a way to bitch about it.
Moreover, despite some “bold” proclamations from the haters… no android vendors ever ended up making their oem tooling available to consumers at all. You have to use the Amazon spudger on your pixel, and you will fuck up the waterproofing when you do your repair, because the android phone won’t seal properly against water without the tools either. IPX sixtywho!?
It’s literally a complete and total net positive: nothing was taken away from you, and you don’t need to use it, and it makes your life easier and produces a better repair if you want it. Apple went out of their way to both make the tooling available to normies who want to rent it or people who want to buy it for real. And people still bitch, and still think they come off better for having done so. Classic “hater” moment, in the Paul Graham sense. Anti-fanboys are real.
https://paulgraham.com/fh.html
Literally, for some people - the pelican cases with the tools are too big and heavy. And that’s enough to justify the hate.
Again, great example of the point I was making in the original comment: people inserting their random hobby horse issues using apple as a foil. You don’t like how phones are made in general, so you’re using apple as a whipping boy for the issue even if it’s not really caused or worsened by the event in question etc. Even if the event in question is apple making that issue somewhat better, and is done worse by all the other vendors etc. Can’t buy tooling for a pixel at all, doing those repairs will simply break waterproofing without it, and you’re strictly better off having the ability to get access to the tooling if you decide you want it, but apple offering it is a flashpoint you can exploit for rhetorical advantage.
> Moreover, despite some “bold” proclamations from the haters… no android vendors ever ended up making their oem tooling available to consumers at all. You have to use the Amazon spudger on your pixel, and you will fuck up the waterproofing when you do your repair, because the android phone won’t seal properly against water without the tools either. IPX sixtywho!?
I think the dirty little secret here is that an iPhone is just about the only phone, apart from maybe some of the really nice Google and Samsung flagships, that anyone wants to repair, because they're bloody expensive. Which is fine and dandy but then do kindly park your endless bemoaning of the subjects of e-waste and non-repairable goods, when Android by far and away is the worse side of that equation, with absolute shit tons of low yield, crap hardware made, sold, and thrown away when the first software update renders it completely unusable (if it wasn't already, from the factory).
Could you chill with the relentless insults? I'd appreciate it.
Perhaps you haven't noticed, but once you tally up overpriced parts together with their oversized, heavy, expensive rental of tools that you don't need, you end up with a sum that matches what you would pay to have it repaired by Apple - except you're doing all of the work yourself.
A curious consumer who has never repaired a device, but might have been interested in doing so, will therefore conclude that repairing their own device is 1. Far too complicated, thanks to an intimidating-looking piece of kit that they recommend, but is completely unnecessary, and 2. Far too expensive, because Apple prices these such that the repair is made economically nonviable.
So yes, I still believe that this is Apple fighting the anti-repair war on a psychological front. You're giving them benefit of the doubt even though they've established a clear pattern of behavior that demonstrates their anti-repair stance beyond any reasonable doubt - although you dance around the citations and claim that I'm being unreasonable about Apple genuinely making the repair situation "better".
Furthermore, I'm not a fanboy or anti-fanboy of any company. The only thing I'm an anti-fanboy of is anti-consumer practices. If Apple changed some of their practices I'd go out and buy an iPhone and a MacBook tomorrow.
The fact that I pointed out that Apple is hostile against repair does not mean that I endorse Google, Samsung, or any other brand - they all suck when it comes to repair, yet you're taking it as a personal attack and calling me names for it.
Please don't cross into flamewar and please avoid tit-for-tat spats in particular. They're not what this site is for, and they make boring reading.
https://news.ycombinator.com/newsguidelines.html
Excuse me? I'm clearly not the one who crossed into flamewar, please read the previous 1-2 comments.
edit: Describing others as "bitching", "bad faith", and "hater", runs afoul of half of the guidelines on this site. That comment somehow isn't moderated, but mine is called out for crossing into flamewar?
You were both breaking the site guidelines, and I replied to both of you in the same way.
Even if I hadn't, though, we need you to follow the rules regardless of what anybody else is doing, and the same goes for every user here. Pointing a finger at the other person isn't helpful.
I understand it can be difficult to have someone point out that you're not approaching a situation in good faith, but that's not exactly "relentless insults".
It can be difficult to handle the intimation that maybe there is the mirror image of the "brainless apple sheeple" too. Haters exist - people who are not able to approach a situation fairly and are just relentlessly negative because they hate X thing too. Negative parasocial attachment is just as much of a thing as a positive one.
And when you get to the point of "literally making the tools available is bad because the pelican cases are too heavy" you have crossed that rubicon. I am being very very polite here, but you are not being rational, and you are kinda having an emotional spasm over someone disagreeing with you on the internet.
Yes, if you want OEM parts and you rent OEM tooling it's probably going to come close to OEM cost. That isn't discriminatory, if the prices are fair+reasonable, and objectively they pretty much are. $49 to rent the tools, and have them shipped both ways, etc, is not an unreasonable ask.
Not having a viable business model for your startup doesn't mean the world is wrong. It means you don't have a viable business idea. And yeah, if you are renting the tools as a one-off, and counting your personal time as having some value (or labor cost in a business), then you probably are not going to get costs that are economical with a large-scale operator with a chain operation and an assembly-line repair shop with repair people who do nothing but repair that one brand. That's not Apple's fault.
What we ultimately come down to with your argument is "apple is killing right-to-repair by being too good at repair and providing too cheap repairs such that indie shops can no longer make a margin" and I'm not sure that's actionable in a social sense of preventing e-waste. People getting their hardware repaired cheaply is good. Long parts lifetimes are good. Etc.
Being able to swap in shitty amazon knockoff parts is a whole separate discussion, of course. And afaik that is going to be forced by the EU anyway, consequences be damned. So what are you complaining about here?
Please don't cross into flamewar and please avoid tit-for-tat spats in particular. They're not what this site is for, and they make boring reading.
https://news.ycombinator.com/newsguidelines.html
Actually to be fully clear, in many cases you have an anti-right: literally not only do you not have a right, but it’s illegal to circumvent technological restrictions intended to prevent the thing you want to do.
As noxious as that whole thing is, it’s literally the law. I agree the outcome is horrifying of course… stallman was right all along, it’s either your device or it’s not.
And legally speaking, we have decided it’s ok to go with “not”.
> better for edge AI than whatever is out there, so I'm looking forward to this
What exactly are you expecting? The current hype for AI is large language models. The word 'large' has a certain meaning in that context. Much larger than what can fit on your phone. Everyone is going crazy about edge AI, what am I missing?
> Everyone is going crazy about edge AI, what am I missing?
If you clone a model and then bake in a more expensive model's correct/appropriate responses to your queries, you now have the functionality of the expensive model in your clone. For your specific use case.
The resulting case-specific models are small enough to run on all kinds of hardware, so everyone's seeing how much work can be done on their laptop right now. One incentive for doing so is that your approaches to problems are otherwise constrained by the cost and security of the Q&A roundtrip.
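In case it's not obvious what "bake in" means in practice: you mostly just collect the expensive model's answers to your own prompts and fine-tune a small open model on them. A minimal Python sketch of the data-collection step, where query_teacher is a hypothetical stand-in for whatever big model or API you're distilling from, and the prompt/completion JSONL layout is just one common convention:

    import json

    def query_teacher(prompt):
        # Placeholder for a call to the expensive model (an API, a bigger
        # local model, whatever). Swap in the real thing here.
        return "canned answer from the big model"

    # Prompts that cover *your* specific use case, not the general case.
    prompts = [
        "Summarize this support ticket: ...",
        "Extract the order number from: ...",
    ]

    with open("distill.jsonl", "w") as f:
        for p in prompts:
            # Each line becomes one training example for the small model,
            # which is what eventually runs on the laptop/phone.
            f.write(json.dumps({"prompt": p, "completion": query_teacher(p)}) + "\n")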
Quantized LLMs can run on a phone, like Gemini Nano or OpenLLAMA 3B. If a small local model can handle simple stuff and delegate to a model in the data center for harder tasks and with better connectivity you could get an even better experience.
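The delegation itself doesn't need to be fancy. A rough Python sketch of the shape of it, where both model calls are placeholders and the "is this hard?" check is just a crude length heuristic, so treat it as illustration only:

    def run_local(prompt):
        # Placeholder for the quantized on-device model.
        return "local answer"

    def run_cloud(prompt):
        # Placeholder for the big data-center model.
        return "cloud answer"

    def answer(prompt, online=True):
        # Keep short, simple prompts on the device; hand long or code-heavy
        # ones to the bigger model when connectivity allows it.
        looks_hard = len(prompt.split()) > 200 or "def " in prompt
        if looks_hard and online:
            return run_cloud(prompt)
        return run_local(prompt)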
> If a small local model can handle simple stuff and delegate to a model in the data center for harder tasks and with better connectivity you could get an even better experience.
Distributed mixture of experts sounds like an idea. Is anyone doing that?
Sounds like an attack vector waiting to happen if you deploy enough competing expert devices into a crowd.
I’m imagining a lot of these LLM products on phones will be used for live translation. Imagine a large crowd event of folks utilizing live AI translation services being told completely false translations because an actor deployed a 51% attack.
I’m not particularly scared of a 51% attack between the devices attached to my Apple ID. If my iPhone splits inference work with my idle MacBook, Apple TV, and iPad, what’s the problem there?
what about in situations with no bandwidth?
Using RAG, a smaller local LLM combined with local data (e.g. your emails, iMessages, etc.) can be more useful than a large external LLM that doesn’t have your data.
No point asking GPT4 “what time does John’s party start?”, but a local LLM can do better.
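The retrieval half of that can be almost trivial. A toy Python sketch, with plain word overlap standing in for real embeddings and the actual model call left out, just to show the shape of it:

    def score(query, doc):
        # Toy relevance: shared words. A real system would compare embeddings.
        q, d = set(query.lower().split()), set(doc.lower().split())
        return len(q & d)

    def build_prompt(query, local_docs, k=3):
        # Grab the k most relevant local snippets and stuff them into the prompt.
        top = sorted(local_docs, key=lambda doc: score(query, doc), reverse=True)[:k]
        return "Context:\n" + "\n".join(top) + "\n\nQuestion: " + query

    messages = [
        "John: party starts at 7pm on Saturday, bring snacks",
        "Mum: dentist appointment moved to Tuesday",
    ]
    print(build_prompt("What time does John's party start?", messages))
    # The resulting prompt is what gets fed to the small on-device model.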
This is why I think Apple’s implementation of LLMs is going to be a big deal, even if it’s not technically as capable. Just making Siri better able to converse (e.g. ask clarifying questions) and giving it the context offered by user data will make it dramatically more useful than silo’d off remote LLMs.
It fits on your phone, and your phone can offload battery burning tasks to nearby edge servers. Seems like the path consumer-facing AI will take.
Google has also been working on (and provides kits for) local machine learning on mobile devices... and they run on both iOS and Android. The Gemini App does send data in to Google for learning, but even that you can opt out of.
Apple's definitely pulling a "Heinz" move with privacy, and it is true that they're doing a better job of it overall, but Google's not completely horrible either.
You won't be able to run any model even as remotely as capable as Gemini locally though.
It's kind of tautological that Gemini Nano is as capable as Gemini and it runs locally.
https://store.google.com/intl/en/ideas/articles/pixel-featur...
Yeah, I was thinking of their non-local model like Gemini advanced though.
In any case, iPhones probably don't have enough memory to run a 3.25B model? E.g. the 15 Pro only has 8 GB (and Gemini Nano seems to only work on the 12 GB Pixel 8 Pro) and the 14 has only 6 GB; that hardly seems sufficient for even a small LLM if you still want to run the full OS and other apps at the same time.
>subpar Android apps
Care to cite these subpar Android apps? The app store is filled to the brim with subpar and garbage apps.
>Google dropping support for an app about once a month
I mean if you're going to lie why not go bigger
>I'm okay with the lesser of two evils.
So the more evil company is the one that pulled out of China because they refused to hand over their users data to the Chinese government on a fiber optic silver plate?
Google operates in China albeit via their HK domain.
They also had project DragonFly if you remember.
The lesser of two evils is that one company doesn’t try to actively profile me (in order for their ads business to be better) with every piece of data it can find and force me to share all possible data with them.
Google is famously known to kill apps that are good and used by customers: https://killedbygoogle.com/
As for the subpar apps: there is a massive difference between the network traffic when on the Home Screen between iOS and Android.
>Google operates in China albeit via their HK domain.
The Chinese government has access to the iCloud account of every Chinese Apple user.
>They also had project DragonFly if you remember.
Which never materialized.
>The lesser of two evils is that one company doesn’t try to actively profile me (in order for their ads business to be better) with every piece of data it can find and force me to share all possible data with them.
Apple does targeted and non targeted advertising as well. Additionally, your carrier has likely sold all of the data they have on you. Apple was also sued for selling user data to ad networks. Odd for a Privacy First company to engage in things like that.
>Google is famously known to kill apps that are good and used by customers: https://killedbygoogle.com/
Google has been around for 26 years I believe. According to that link 60 apps were killed in that timeframe. According to your statement that Google kills an app a month that would leave you 252 apps short. Furthermore, the numbers would indicate that Google has killed 2.3 apps per year or .192 apps per month.
>As for the subpar apps: there is a massive difference between the network traffic when on the Home Screen between iOS and Android.
Not sure how that has anything to do with app quality, but if network traffic is your concern there's probably a lot more an Android user can do than an iOS user to control or eliminate the traffic.
> Google has been around for 26 years I believe. According to that link 60 apps were killed in that timeframe. According to your statement that Google kills an app a month that would leave you 252 apps short. Furthermore, the numbers would indicate that Google has killed 2.3 apps per year or .192 apps per month.
Most of the "Services" on that list are effectively apps, too:
VPN by Google One, Album Archive, Hangouts, all the way back to Answers, Writely, and Deskbar.
I didn't touch hardware, because I think that should be considered separately.
The first of 211 services on that site was killed in 2006.
The first of the 60 apps on that site was killed in 2012.
So even apps alone, 4.28 a year.
But more inclusively, 271 apps or services in 17 years is ~16/year, over one a month.
You need to remind yourself of the site guidelines about assuming the worst. Your comments just come across condescendingly.
>Most of the "Services" on that list are effectively apps, too:
Even with the additional apps you've selected it still doesn't come close to the one app per month claim.
>I didn't touch hardware, because I think that should be considered separately.
So why even mention it? Is Apple impervious to discontinuing hardware?
>The first of 211 services on that site was killed in 2006.
So we're talking about services now? Or apps? Or apps and services? The goal posts keep moving.
>You need to remind yourself of the site guidelines about assuming the worst. Your comments just come across condescendingly.
I suggest you also consult the guidelines in regards to calling people names. My comments were never intended to be inferred that way.
> The Chinese government has access to the iCloud account of every Chinese Apple user.
Source?
I think it was Paul Thurrott on the Windows Weekly podcast who said that all these companies don't really care about privacy. Apple takes billions of dollars a year to direct data towards Google via the search defaults. Clearly privacy has a price. And I suspect it will only get worse with time as they keep chasing the next quarter.
Tim Cook unfortunately is so captured in that quarterly mindset of 'please the share holders' that it is only a matter of time.
It doesn’t matter to me if they “really care” about privacy or not. Megacorps don’t “really care” about anything except money.
What matters to me is that they continue to see privacy as something they can sell in order to make money.
The Google payments are an interesting one; I don't think it's a simple "Google pays them to prefer them", but a "Google pays them to stop them from building a competitor".
Apple is in the position to build a competing search product, but the amount Google pays is the amount of money they would have to earn from it, and that is improbable even if it means they can set their own search engine as default.
Apple isn't ethically Mullvad, but they're much better than some of their android competitors who allow adverts on the lock screen.
> but it is about selling luxury hardware.
While Apple is first and foremost a hardware company, it has more or less always been about the "Apple experience". They've never "just" been a hardware company.
For as long as Apple has existed, they've done things "their way" both with hardware and software, though they tend to want to abstract the software away.
If it was merely a question of selling hardware, why does iCloud exist? Or AppleTV+, or Handoff? Or iMessage, or the countless other seemingly small life improvements that somehow the remainder of the industry cannot seem to figure out how to do well?
Even a "simple" thing like switching headphones seamlessly between devices is something I no longer think about; it just happens, and it takes a trip with a Windows computer and a regular Bluetooth headset to remind me how things used to be.
As part of their "privacy first" strategy, iMessage also fits in nicely. Apple doesn't have to operate a huge instant messaging network, which undoubtedly is not making a profit, but they do, because having one entry point to secure, encrypted communication fits well with the Apple Experience. iMessage did so well at abstracting the ugly details of encryption that few people even realize that that's what the blue bubble is actually about; it more or less only means your message is end-to-end encrypted. As a side effect you can also send full resolution images (and more), but that's in no way unique to iMessage.
The MacBook Air is not a luxury device. That meme is out of date
I can't buy a MacBook Air for less than $999, and that's for a model with 8GB RAM, an 8-core CPU and 256GB SSD. The equivalent (based on raw specs) in the PC world runs for $300 to $500.
How is something that is twice as expensive as the competition not a luxury device?
EDIT: Because there's repeated confusion in the replies: I am not saying that a MacBook Air is not objectively a better device. I'm saying it is better by metrics that fall strictly into the "luxury" category.
Better build quality, system-on-a-chip, better OS, better battery life, aluminum case—all of these are luxury characteristics that someone who is looking for a functional device that meets their needs at a decent price won't have as dealbreakers.
Curious what criteria you're using for qualifying luxury. It seems to me that materials, software, and design are all on par with other more expensive Apple products. The main difference is the chipset, which I would argue is on an equal quality level as the pro chips but designed for a less power hungry audience.
Maybe for you, but I still see sales guys who refuse to work on WinTel machines where basically all they do is browse the internet and do spreadsheets - mainly just because they would not look cool compared to other sales guys rocking MacBooks.
Their primary business is transitioning to selling services and extracting fees. It's their primary source of growth.
Hey, I'm way ahead of Apple. I sell my services to my employer and extract fees from them. Do you extract fees too?
So the new iPad & M4 was just some weekend project that they shrugged and decided to toss over to their physical retail store locations to see if anyone still bought physical goods eh
> The promise of privacy
I have very little faith in apple in this respect.
For clarity, just install little snitch on your machine, and watch what happens with your system. Even without being signed in with an apple id and everything turned off, apple phones home all the time.
You can block 17.0.0.0/8 at the router, opening up only the notification servers. CDNs are a bit harder, but can be done with dnsmasq allow/deny of wildcard domains. Apple has documentation on network traffic from their devices, https://support.apple.com/en-us/101555
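For the dnsmasq half, the trick is that dnsmasq prefers the most specific domain match, so you can deny broadly and then punch holes for the push servers. A config sketch only; the exact domains to allow are the ones in Apple's document above, these are just examples:

    # dnsmasq.conf sketch: refuse to resolve Apple domains by default...
    address=/apple.com/0.0.0.0
    address=/icloud.com/0.0.0.0
    # ...but let the notification servers resolve normally
    # (the longer match wins, so this overrides the line above).
    server=/push.apple.com/#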
Their 2nd largest revenue source (at ... 20-25%, below only the iphone) is software services.
iCloud, App store revenue, apple tv, and so on
Why would all services be grouped together in a single category when all hardware is not? iCloud and Apple TV are very different.
As a privacy professional for many, many years this is 100% correct. Apple wouldn’t be taking billions from Google for driving users to their ad tracking system, they wouldn’t give the CCP access to all Chinese user data (and maybe beyond), and they wouldn’t be on-again-off-again flirting with tailored ads in Apple News if privacy was a “human right”.
(FWIW my opinion is it is a human right, I just think Tim Cook is full of shit.)
What Apple calls privacy more often than not is just putting lipstick on the pig that is their anticompetitive walled garden.
Pretty much everybody in SV who works in privacy rolls their eyes at Apple. They talk a big game but they are as full of shit as Meta and Google - and there’s receipts to prove it thanks to this DoJ case.
Apple want to sell high end hardware. On-device computation is a better user experience, hands down.
That said, Siri is utter dogshit so on-device dogshit is just faster dogshit.
Any privacy guides for today's smartphone user?
> The promise of privacy is one way in which they position themselves, but I would not bet the bank on that being true forever.
Indeed.
Privacy starts with architectural fundamentals that are very difficult to retrofit...
If a supplier of products has not built the products this way, it would be naive to bet bank or farm on the supplier. Even if there were profound motivation to retrofit.
Add to this the general tendency of the market to exploit its customers.
>The promise of privacy is one way in which they position themselves, but I would not bet the bank on that being true forever.
They failed with their ad-business so this is a nice pivot. I'll take it, I'm not usually a cheerleader for Apple, but I'll support anyone who can erode Google's surveillance dominance.
> The promise of privacy is one way in which they position themselves, but I would not bet the bank on that being true forever.
Everybody so quick to forget Apple was/is part of PRISM like any other company.
> The promise of privacy is one way in which they position themselves, but I would not bet the bank on that being true forever.
There are a ton of us out here that consciously choose Apple because of their position on privacy. I have to imagine they know how many customers they'll lose if they ever move on this, and I want to believe that it's a large enough percentage to prevent it from happening. Certainly my circle is not a useful sample, but the Apple people in it are almost all Apple people because of privacy.
When I was at Apple for a short time, there was a small joke I hear from the ex-amazonians there who would say "What's the difference between an Apple software engineer and an Amazon software engineer? The Amazon engineer will spin up a new service on AWS. An Apple engineer will spin up a new app". Or something along those lines. I forget the exact phrasing. It was a joke that Apple's expertise is in on-device features, whereas Amazon thrives in the cloud services world.
What would a Google engineer do? Write a design doc or sunset a service?
Every company is selling one thing or another, and nothing is going to last forever. I really fail to see what, except for generic negativity, your comment adds to anything.
It’s not even true now. Apple (and by extension the USG and CCP) can read ~every iMessage. The e2ee is backdoored.
What do you mean luxury? Samsung produces phones that are 2x more expensive and are more luxurious.
E.g. Apple still doesn't offer a flipphone that turns into a tablet when you open it.
That is not where Apple's growth has been for quite some time, it's services. And because of that I'll be awaiting the economic rental strategy to come at any moment.
> The promise of privacy is one way in which they position themselves
Yep, the claim to prioritize privacy is a key selling point for them
Nearly 1/4th of their revenue is Apple services, and its their primary source of revenue growth to feed the beast.
For all their competitors it’s not true right now.
I wouldn't bank on that being true forever after 2012. A corporation's goals are largely determined by the corporate structure.
Privacy is a luxury today so yeah selling luxury hardware and promise of privacy form a coherent business position.
If you average it out, it is not that expensive. Unlike an Android phone, and let's not even talk about Android tablets.
The important is that a good starting ai learning platform is what … most apple does not touch those price.
Hence with privacy it is a good path.
You do not want communism even if it is not expensive in the short term.
Nothing is true forever. Google wasn’t evil forever, Apple won’t value privacy forever.
Until we figure out how to have guarantees of forever, the best we can realistically do is evaluate companies and their products by their behavior now weighted by their behavior in the past.
Privacy is challenging to achieve
>The promise of privacy is one way in which they position themselves, but I would not bet the bank on that being true forever.
Everyone seems to have forgotten about the Celebrity iCloud photo leak.
As soon as the privacy thing goes away, I'd say a major part of their customer base goes away too. Most people use Android so they don't get "hacked". If Apple is doing the hacking, I'd just buy a cheaper alternative.
Maybe true for a lot of the HN population, but my teenagers are mortified by the idea of me giving them android phones because then they would be the pariahs turning group messages from blue to green.
And just to elaborate on this: it's not just snobbery about the color of the texts, for people who rely on iMessage as their primary communication platform it really is a severely degraded experience texting with someone who uses Android. We Android users have long since adapted to it by just avoiding SMS/MMS in favor of other platforms, but iPhone users are accustomed to just being able to send a video in iMessage and have it be decent quality when viewed.
Source: I'm an Android user with a lot of iPhones on my in-laws side.
Be aware that iPhones degrade MMS more than necessary and the only reason seems to be to punish Android use.
No, it really is just snobbery.
I’m in Europe and everyone uses WhatsApp, and while Android does have a higher share over here, iPhones still dominate the younger demographics. I’m not denying blue/green is a factor in the US, but it’s not even a thing here. It’s nowhere near the only or even a dominant reason iPhones are successful with young people.
This is a sad state of affairs.
Interesting that some people would take that as an Apple problem and others would take it as a Google problem
Who’s at fault for not having built-in messaging that works with rich text, photos, videos, etc?
Google has abandoned more messaging products than I can remember while Apple focused on literally the main function of a phone in the 21st century. And they get shit for it
Most of the world doesn't care about built in. Apple decided against iMessage on Android for lock in. Android had RCS in 2019.
Apple get shit for it because they made it a proprietary protocol for which clients are not available on anything except their own hardware. The whole point of messaging is that it should work with all my contacts, not just those who drank the Apple-flavored Kool-Aid.
Google’s protocol is proprietary too - their encryption extension makes it inaccessible for anyone else and google will not partner or license (companies have tried).
RCS as currently implemented is iMessage but with a coat of google paint. There is no there there.
Google should get plenty of shit too for closing down GTalk in the first place. It's not an either-or. Big tech in general hates open protocols and interoperability for consumer stuff; Apple is just the most egregious offender there.
Snobbery is an expensive pastime.
The social implications of using Android phones...
At least here in Brazil, I've never heard such arguments.
Seems even more unlikely for non technical users.
It's just their latest marketing campaign, as far as I can tell. The vast majority of people buy iPhones because of the status it gives.
I never understood this argument.
There's no “status” to a brand of phone when the cheapest point of entry is comparable and the flagship is cheaper than the alternative flagship.
Marketing in most of Europe is chiefly not the same as in the US though, so maybe it's a perspective thing.
I just find it hard to really argue “status” when the last 4 iPhone generations are largely the same and cheaper than the Samsung flagships.
At Elgiganten a Samsung S24 Ultra is 19,490 SEK[0].
The most expensive iPhone 15 pro max is 18,784 SEK at the same store[1].
[0]: https://nya.elgiganten.se/product/mobiler-tablets-smartklock...
[1]: https://nya.elgiganten.se/product/mobiler-tablets-smartklock...
My take is that it's like a fashion accessory. People buy Gucci for the brand, not the material or comfort.
Rich people ask for the latest most expensive iPhone even if they're only going to use WhatsApp and Instagram on it. It's not because of privacy or functionality, it's simply to show off to everyone they can purchase it. Also to not stand out within their peers as the only one without it.
As another comment said: it's not an argument, it's a fact here.
I have an iPhone so I guess I qualify as a rich person by your definition. I am also a software engineer. I cannot state enough how bogus that statement is. I've used both iPhone and Android, and recent flagships. iPhone is by far the easiest one to use. Speaking in more objective terms, iPhones have a coherent UI which maintains its consistency both throughout the OS and over the years. They're the most dumbed down phones and easiest to understand. I recommend iPhone to all my friends and relatives.
There's obviously tons of people who see iPhone as a status item. They're right, because iPhone is expensive and only the rich can buy them. This doesn't mean iPhone is not the best option out there for a person who doesn't want to extensively customize his phone and just use it.
Yes, by pure statistics you are probably rich compared to everyone else. The average software developer salary is way bigger than the average salary for the entirety of the US. Let's not even mention compared to the rest of the world.
Sure, some people pick up the iPhone because they like the specs, or the apps, or whatever else. That's why I said the majority picks it up for status, not all. But keep in mind nobody's judging the iPhone's specs or capabilities here. We're talking about why people buy it.
Ask any teenager why they want an iPhone. I'd be very surprised if even one said it's because of privacy. It's because of the stupid blue bubble, which is a proxy for status.
I'm pretty sure if Apple released the same phone again with a new name and design, people would still buy it. For the majority, it's not because of features, ease of use, specs, etc: it's status.
> iPhone and Android, and recent flagships. iPhone is by far the easiest one to use. Speaking in more objective terms, iPhones have a coherent UI
It’s not about if you’ve used Android, it’s about if you’ve been poor-ish or stingy.
To some people those are luxuries: the most expensive phone they buy is a mid-range Motorola for $300 with a Snapdragon 750G or whatever. They run all the same apps after all, and they take photos.
iPhones are simply outside of their budget.
It's not an argument; just ask why people lust after the latest iPhones in poor countries. They do it because they see rich people owning them. Unless you experience that, you won't really understand it.
The cheapest point of entry is absolutely not comparable. The cheapest new iPhone on apple.com is $429. The cheapest new Samsung on samsung.com is $199 (They do have a phone listed for $159, but it's button says "Notify Me").
Granted, you may have been leaning very heavily on the dictionary definition of "comparable", in that the two numbers are able to be compared. However, when the conclusion of that comparison is "More than twice the price", I think you should lead with that.
Keep in mind, the iPhone SE is using a 3 year old processor, the Samsung A15 was released 5 months ago with a brand new processor.
Is this brand new cpu faster or more energy efficient?
Yes.
According to various sites, the Mediatek Dimensity 6100+ is a 6nm update to a core that was released 3 years ago (the Dimensity 700, on 7nm). It's 5-10% faster, likely due to the update from 7nm to 6nm, as the cores are the same and run at the same speed. It contains an updated Bluetooth chipset (from 5.1 to 5.2) and supports a larger max camera. The camera on the A15 is well below the max size of the previous chipset; however, the increased camera bandwidth should ensure that the camera feels snappier (sluggish cameras being a common complaint on low-end phones). The process improvement should increase efficiency as well, however there are no benchmarks available to test this.
It's fashion and the kids are hip. But there is an endless void of Apple haters here who want to see it burn. They have nothing in common with 99.9% of the customer base.
I was thinking about this for a while: the problem is not about Apple, it's the fact that the rest of the industry is gutless and has zero vision or leadership. Whatever Apple does, the rest of the industry will follow or oppose - but will be defined by it.
It’s like how people who don’t like US and want nothing to do with US still discuss US politics, because it has so much effect everywhere.
(Ironically, not enough people discuss China with any coherent level of understanding.)
You're absolutely right, I'm so glad that Apple was the first company to release a phone with a touch screen, or a phone with an app store, or a smart watch or a VR headset.
Apple doesn't release new products, they wait until the actual brave and innovating companies have done the exploration and then capitalize on all of their learnings. Because they are never the first movers and they have mountains of cash, they're able to enter the market without the baggage of early adopters. They don't have to worry about maintaining their early prototypes.
Apple doesn't innovate or show leadership, they wait until the innovators have proven that the market is big enough to handle Apple, then they swoop in with a product that combines the visions of the companies that were competing.
Apple is great at what they do, don't get me wrong. And swooping in when the market is right is just good business. Just don't mistake that for innovation or leadership.
They famously had a standoff with the US gov't over the Secure Enclave.
Marketing aside, all indications point to the iOS platform being the most secure mobile option (imo).
This is a prejudiced take. Running AI tasks locally on the device definitely is a giant improvement for the user experience.
But not only that, Apple CPUs are objectively leagues ahead of their competition in the mobile space. I am still using an iPhone released in 2020 with absolutely no appreciable slowdown or loss in perceived performance. Because even a 4-year-old iPhone still has specs that don't lag much behind the equivalent Android phones, I still receive the latest OS updates, and because, frankly, Android OS is a mess.
If I cared about status, I would have changed my phone already for a new one.
> I am still using an iPhone released in 2020 with absolutely no appreciable slowdown or loss in perceived performance.
My Pixel 4a here is also going strong, only the battery is slowly getting worse. I mean, it's 2024, do phones really still get slow? The 4a is now past android updates, but that was promised after 3 years. But at 350 bucks, it was like 40% less than the cheapest iPhone mini at that time.
> I mean, it's 2024, do phones really still get slow?
Hardware is pretty beefed up but bloat keeps on growing, that is slowing things down considerably.
what about security updates?
> I am still using an iPhone released in 2020 with absolutely no appreciable slowdown or loss in perceived performance.
Only because Apple lost a lawsuit otherwise they'd have kept intentionally slowing it down.
This has been debunked.
.... by Apple.
See https://en.wikipedia.org/wiki/Batterygate
Right. "By Apple".
Apple says it made these changes for other reasons, honestly, truly. And if it happened to have the same effect, then that was unfortunate, and unintended.
Only Apple really knows. But there was a slew of changes and reversals following the drama. "Oh, we'll implement notifications now", "Oh, we'll change the peak performance behavior", and "we will change and add additional diagnostics to make sure issues are battery related" certainly has a feel for a bunch of ex post facto rationalization of several things that seem, to me, that if it was truly a battery thing all along, would have been functional requirements.
>Apple CPUs are objectively leagues ahead of their competition in the mobile space
This is a lie. The latest Android SoCs are just as powerful as the A series.
>Because even a 4-year-old iPhone still has specs that don't lag much behind the equivalent Android phones, I still receive the latest OS updates, and because, frankly, Android OS is a mess.
Samsung and Google offer 7 years of OS and security updates. I believe that beats the Apple policy.
> Samsung and Google offer 7 years of OS and security updates. I believe that beats the Apple policy.
On the second part:
https://en.wikipedia.org/wiki/IPadOS_version_history
The last iPads to stop getting OS updates (including security, to be consistent with what Samsung and Google are pledging) got 7 and 9 years of updates each (5th gen iPad and 1st gen iPad Pro). The last iPhones to lose support got about 7 years each (iPhone 8 and X). 6S, SE (1st), and 7 got 9 and 8 years of OS support with security updates. The 5S (released in 2013) last got a security update in early 2023, so also about 9 years, the 6 (2014) ended at the same time so let's call it 8 years. The 4S, 2011, got 8 years of OS support. 5 and 5C got 7 and 6 years of support (5C was 5 in a new case, so was always going to get a year less in support).
Apple has not, that I've seen at least, ever established a long term support policy on iPhones and iPads, but the numbers show they're doing at least as well as what Samsung and Google are promising to do, but have not yet done. And they've been doing this for more than a decade now.
EDIT:
Reworked the iOS numbers a bit, down to the month (I was looking at years above and rounding, so this is more accurate). iOS support time by device for devices that cannot use the current iOS 17 (so the XS and above are not counted here) in months:
The average is 72.5 months, just over 6 years. If we knock out the first 2 phones (both have somewhat justifiable short support periods, with massive hardware changes between each and their successor) the average jumps to just shy of 79 months, or about 6.5 years. The 8 and X look like regressions, but their last updates were just 2 months ago (March 21, 2024), so there's still a good chance their support period will increase and exceed the 7-year mark like every model since the 5S. We'll have to see whether they get any more updates in November 2024 or later and hit the 7-year mark.
Google can't keep a product alive. You're welcome to believe in their promises of extended support after all those years of shitty update policies.
>The last iPads to stop getting OS updates (including security, to be consistent with what Samsung and Google are pledging) got 7 and 9 years of updates each (5th gen iPad and 1st gen iPad Pro). The last iPhones to lose support got about 7 years each (iPhone 8 and X). 6S, SE (1st), and 7 got 9 and 8 years of OS support with security updates. The 5S (released in 2013) last got a security update in early 2023, so also about 9 years, the 6 (2014) ended at the same time so let's call it 8 years. The 4S, 2011, got 8 years of OS support. 5 and 5C got 7 and 6 years of support (5C was 5 in a new case, so was always going to get a year less in support).
These are very disingenuous numbers that don't tell the complete story. An iPhone 7 getting a single critical security patch does not take into account the hundreds of security patches it did not receive when it stopped receiving support. It received that special update because Apple likely was told or discovered it was being exploited in the wild.
Google and Samsung now offer 7 years of OS upgrades and 84 months of full security patches. Selectively patching a phone that is out of the support window with a single security patch does not automatically increase its EOL support date.
I look forward to these vendors delivering on their promises, and I look forward to Apple perhaps formalizing a promise with less variability for future products.
Neither of these hopes retroactively invalidates the fact that Apple has had a much better track record of supporting old phone models up to this point. Even if you do split hairs about the level of patching some models got in their later years, they still got full iOS updates for years longer than most Android phones got any patches at all, regardless of severity.
This is not an argument that somehow puts Android on top, at best it adds nuance to just how much better iOS support has been up to this point.
Let's also not forget that if Apple wasn't putting this kind of pressure on Google, they wouldn't have even made the promise to begin with, because it's clear how long they actually care to support products with no outside pressure.
I agree. This is the type of competition I like to see between these two companies. In the end the consumer wins regardless of which one you buy. Google has also promised 10 years of Chromebook support, so they've clearly got the message on the importance of supporting hardware much longer than a lot of people would use them for.
They made that pledge for the Pixel 8 (2023). Let's revisit this in 2030 and see what the nature of their support is at that point and how it compares to Apple's support for iPhone devices. We can't make a real comparison since they haven't done anything yet, only made promises.
What we can do today is note that Apple never made a promise, but did provide very long security support for their devices despite that. They've already met or come close to the Samsung/Google pledge (for one device) on almost half their devices, and those are all the recent ones (so it's not a downward trend of good support then bad support, but rather mediocre/bad support to improving and increasingly good support).
Another fun one:
iPhone XS was released in September 2018, it is on the current iOS 17 release. In the absolute worst case of it losing iOS 18 support in September, it will have received 6 full years of support in both security and OS updates. It'll still hit 7 years (comfortably) of security updates. If it does get iOS 18 support in September, then Apple will hit the Samsung/Google pledge 5 years before Samsung/Google can even demonstrate their ability to follow through (Samsung has a chance, but Google has no history of commitment).
I have time to kill before training for a century ride:
Let's ignore everything before the iPhone 4S; they had short support periods, that's just a fact, and it's hardly worth investigating. This is an analysis of devices released in 2011 and later, when the phones had mostly matured as devices, so we should expect longer support periods. These are the support periods during which the phones were able to run the still-current iOS version, not counting security updates or minor updates delivered after the major iOS version had been deprecated. As an example, the iPhone 4S had support from 2011-2016; in 2016 its OS, iOS 9, was replaced by iOS 10. Here are the numbers:
The 6S is a bit of an outlier, hitting 7 years of full support running the current iOS. The 5C and SE (1st) both got less total support, but their internals were the same as prior phones and they lost support at the same time as them (this is reasonable, if annoying, and does drag down the average). So Apple has clearly trended towards 6 years of full support, and the XS (as noted above) will get at least 6 years of support as of this coming September. We'll have to see if they can get it past the 7-year mark; I know they haven't promised anything, but the trend suggests they can.
Sure. They also pledged to support Chromebooks for 10 years. My point is that I don't think they'll be clawing back their new hardware support windows anytime soon. Their data indicates that these devices were used well beyond their initial support window, so it was in their, and their users', best interest to keep them updated as long as they possibly could. 3 years of OS updates and 4 years of security updates was always the weak link in their commitment to security. And this applies to all of their devices including the A series - something I don't see other Android OEMs even matching.
BTW, my daily driver is an iPhone 13 and I was coming from an iPhone X. So I'm well aware of the incredible support Apple provides its phones. Although, I would still like to see an 8+ year promise from them.
Strangely, Android 14 seems to not be available for the S20, a phone which was released in 2020?
Or am I mistaken here?
The vast majority of people don’t. They buy because the ecosystem works. Not sure how I get status from a phone that nobody knows I have. I don’t wear it on a chain.
Could it possibly be different in Brazil?
iPhones are not ubiquitous here, and they're way more expensive than other options.
Yes, Apple has cultivated a certain brand image in some ways.
Apple only pivoted into the “privacy” branding relatively recently [1] and I don't think that many people came for that reason alone. In any case, most are now trapped in the walled garden and the effort to escape is likely too big. And there's no escape anyway, since Google will always make Android worse in that regard…
[1] in 2013 they even marketed their “iBeacon” technology as a way for retail stores to monitor and track their customers which…
Ca 2013 was the release of the Nexus 5, arguably the first really usable android smartphone.
Privacy wasn’t really a concern because most people didn’t have the privacy eroding device yet. In the years following the Nexus 5 is where smartphones went into geometric growth and the slow realization of the privacy nightmare became apparent
Imho I was really excited to get a Nexus 4 at the time, just a few short years later the shine wore off and I was horrified at the smartphone enabled future. And I have a 40 year background in computers and understand them better than 99 out of 100 users – if I didn’t see it, I can’t blame them either
> Ca 2013 was the release of the Nexus 5, arguably the first really usable android smartphone.
What a strange statement. I was late to the game with a Nexus S in 2010, and it was really usable.
Define usable. Imho before Nexus 4 everything was crap, Nexus 4 barely was enough (4x1.4 GHz), Nexus 5 (4x2.2GHz) plus software at the time (post-kitkat) was when it was really ready for mainstream
I'd say from my experience the average Apple user cares less about privacy than the general public. It's a status symbol first and foremost; 99% of what people do on their phones is basically identical on both platforms at this point.
is that still the case?
I think it will be a winning strategy. Lag is a real killer for LLMs.
I think they'll have another LLM on a server (maybe a deal for openai/gemini) that the one on the device can use like ChatGPT uses plugins.
But on device Apple have a gigantic advantage. Rabbit and Humane are good ideas humbled by shitty hardware that runs out of battery, gets too hot, has to connect to the internet to do literally anything.
Apple is in a brilliant position to solve all those things.
I hope they announce something good at WWDC
I run a few models (eg Llama3:8b) on my 2023 MacBook Air, and there is still a fair bit of lag and delay, compared to a hosted (and much larger) model like Gemini. A large source of the lag is the initial loading of the model into RAM. Which an iPhone will surely suffer from.
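If the local runner is Ollama (my guess for llama3:8b), a lot of that load-into-RAM lag can be amortized by asking the server to keep the weights resident between requests. A rough sketch; the endpoint and keep_alive parameter are Ollama's HTTP API as I remember it, so double-check against its docs:

    import json, urllib.request

    def ask(prompt):
        body = json.dumps({
            "model": "llama3:8b",
            "prompt": prompt,
            "stream": False,
            # Keep the model loaded for an hour so the next call
            # doesn't pay the initial load-into-RAM cost again.
            "keep_alive": "1h",
        }).encode()
        req = urllib.request.Request(
            "http://localhost:11434/api/generate",
            data=body,
            headers={"Content-Type": "application/json"},
        )
        with urllib.request.urlopen(req) as resp:
            return json.loads(resp.read())["response"]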
Humane had lag and they used voice chat, which is a bad UX paradigm. VUI is bad because it adds lag to the information within the medium. Listening to preambles and lists is always slower than the human eye's ability to scan a page of text. Their lag is not due to LLMs, which can be much faster than whatever they did.
We should remind ourselves that an iPhone can likely suffer similar battery and heat issues - especially if it’s running models locally.
Humane's lag feels like it's down to just bad software design too. It almost feels like a two-stage thing is happening: it sends your voice or transcription up to the cloud, figures out where it needs to go to get it done, tells the device to tell you it's about to do that, then finally does it. E.g.
User: "What is this thing?"
Pin: "I'll have a look what that is" (It feels this response has to come from a server)
Pin: "It's a <answer>" (The actual answer)
We're still a bit away from an iPhone running anything viable locally; even with small models today you can almost feel the chip creaking under the load, and the whole phone begins to choke.
> Lag is a real killer for LLMs.
I'm curious to hear more about this. My experience has been that inference speeds are the #1 cause of delay by orders of magnitude, and I'd assume those won't go down substantially on edge devices because the cloud will be getting faster at approximately the same rate.
Have people outside the US benchmarked OpenAI's response times and found network lag to be a substantial contributor to slowness?
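It's cheap to measure yourself: time the first streamed chunk separately from the full response, since network latency mostly shows up in the former and inference speed in the latter. A rough Python sketch against any streaming HTTP endpoint; the URL and payload here are placeholders:

    import json, time, urllib.request

    def measure(url, payload):
        req = urllib.request.Request(
            url,
            data=json.dumps(payload).encode(),
            headers={"Content-Type": "application/json"},
        )
        start = time.perf_counter()
        first = None
        with urllib.request.urlopen(req) as resp:
            while resp.read(1024):
                if first is None:
                    # Roughly network round trip + server queueing.
                    first = time.perf_counter() - start
        total = time.perf_counter() - start  # adds the model's generation time
        return first, total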
of course not, it's just text. people here are just spitballing.
groq is way better for inference speeds btw
Having used groq and other fast LLM services a fair bit, lag seems negligible. You're literally just passing text at close to the speed of light.
* when you have a good internet connection
** when you live in the USA
> * when you have a good internet connection
Or at least, a good enough internet connection to send plaintext.
> * when you live in the USA
Even from Australia to USA is just ~300ms of latency for first token and then the whole thing can finish in ~1s. And making that faster doesn't require on-device deployment, it just requires a server in Australia - which is obviously going to be coming if it hasn't already for many providers.
There really isn't enough emphasis on the downsides of server side platforms.
So many of these are only deployed in US and so if you're say in country Australia not only do you have all your traffic going to the US but it will be via slow and intermittent cellular connections.
It makes using services like LLMs unusably slow.
I miss the 90s and having applications and data reside locally.
Unusably slow? It's like 0.3 seconds to first token and then pretty much all of the tokens can follow within a second.
I find it hard to understand the edge usecase for text-based models.
Even in Australia, is the LLM lag to a server noticeable?
Generally an LLM seems to take about 3s or more to respond, and the network delay to the US is a couple of hundred milliseconds.
The network delay seems minimal compared to the actual delay of the LLM.
I wonder if BYOE (bring your own electricity) also plays a part in their long-term vision? Data centres are expensive in terms of hardware, staffing and energy. Externalising this cost to customers saves money, but also helps to paint a green(washing) narrative. It's more meaningful to more people to say they've cut their energy consumption by x than to say they have a better server obsolescence strategy, for example.
Apple has committed that all of its products will be carbon-neutral - including emissions from charging during their lifetime - by 2030. The Apple Watch is already there.
https://www.apple.com/newsroom/2023/09/apple-unveils-its-fir...
From your link:
> "Apple defines high-quality credits as those from projects that are real, additional, measurable, and quantified, with systems in place to avoid double-counting, and that ensure permanence."
Apple then pledged to buy carbon credits from a company called Verra. In 2023, an investigation found that more than 90% of Verra's carbon credits are a sham. Notably, Apple made their pledge after the results of this investigation were known - so much for their greenwashing.
https://www.theguardian.com/environment/2023/jan/18/revealed...
I wish we had gotten a resolution on whether that report from the Guardian was correct. Regardless, Apple says that credits make up only about 25% of their reduced emissions.
That is an interesting angle to look at it from. If they're gonna keep pushing this they end up with a strong incentive to make the iPhone even more energy efficient, since users have come to expect good/always improving battery life.
At the end of the day, though, AI workloads in the cloud will always be a lot more compute-efficient, meaning a lower combined footprint. On the other hand, in the server-based model there is more incentive to pre-compute (i.e., waste inference on) things to make them appear snappy on device. An analogy would be all the energy spent doing video encoding for YouTube videos that never get watched, although that counts as "idle" resources for budgeting purposes.
I’m not sure it’s that (benched pointed out their carbon commitment) so much as simple logistics.
Apple doesn’t have to build the data centers. Apple doesn’t have to buy the AI capacity themselves (even if from TSMC for Apple designed chips). Apple doesn’t have to have the personnel for the data centers or the air conditioning. They don’t have to pay for all the network bandwidth.
There are benefits to the user to having the AI run on their own devices in terms of privacy and latency as mentioned by the GP.
But there are also benefits to Apple simply because it means it’s no longer their resources being used up above and beyond electricity.
I keep reading about companies having trouble getting GPUs from the cloud providers and that some crypto networks have pivoted to selling GPU access for AI work as crypto profits fall.
Apple doesn’t have to deal with any of that. They have underused silicon sitting out there ready to light up to make their customers happy (and perhaps interested in buying a faster device).
I agree with everything you said except the TSMC bit. They are quite literally competing with Nvidia et al. for fab space for customers' chips. Sure, they get the AI bits built in to existing products, but those chips are surely bigger and more expensive to manufacture and commit from TSMC because of it.
I think we’re on the same page.
I was trying to say that there was still a cost to using their own chips for server AI because they still had to pay to have them made so they weren’t “free” because they’re Apple products as opposed to buying nVidia parts.
You’re right, there is a cost to them to put the AI stuff on end user chips too since die space isn’t free and extra circuits mean fewer chips fit per wafer.
Why would it be greenwashing? Aren’t their data centers and commercial operations run completely on renewable energy?
If you offload data processing to the end user, then your data center uses less energy on paper. The washing part is that work is still being done and spending energy, just outside of the data center.
Which honestly is still good for the environment to have the work distributed across the entire electricity grid.
That work needs to be done anyways and Apple is doing it in the cleanest way possible. What’s an alternative in your mind, just don’t do the processing? That sounds like making progress towards being green. If you’re making claims of green washing you need to be able to back it up with what alternative would actually be “green”.
I didn't make any claims, I just explained what the parent was saying. There could be multiple ways to make it more green: one being not doing the processing, or another perhaps just optimizing the work being done. But actually, no, you don't need a viable way to be green in order to call greenwashing "greenwashing." It can just be greenwashing, with no alternative that is actually green.
> Which honestly is still good for the environment to have the work distributed across the entire electricity grid.
This doesn't make any sense.
> If you’re making claims of green washing you need to be able to back it up with what alternative would actually be “green”.
Sometimes there isn't an alternative. In which case you don't get to look green, sorry. The person critiquing greenwashing doesn't need to give an alternative, why would that be their job? They're just evaluating whether it's real or fake.
Though in this case using renewable energy can help.
> Sometimes there isn't an alternative. In which case you don't get to look green, sorry. The person critiquing greenwashing doesn't need to give an alternative, why would that be their job? They're just evaluating whether it's real or fake.
Baselessly calling every greening and sustainability effort “greenwashing”, especially when there’s practically no thought put into what the alternative might be, is trite and borderline intellectually dishonest. They don’t want to have a conversation about how it could be improved, they just want to interject “haha that’s stupid, corporations are fooling all of you sheeple” from their place of moral superiority. This shit is so played out.
> Baselessly calling every greening and sustainability effort “greenwashing”, especially when there’s practically no thought put into what the alternative might be
Baseless? The foundation of this accusation is rock solid. Offloading the exact same computation to another person so your energy numbers look better is not a greening or sustainability effort.
Fake green should always be called greenwashing.
You don't need to suggest an improvement to call out something that is completely fake. The faker doesn't get to demand a "conversation".
You've seen a bunch of people be incorrectly dismissive and decided that dismissiveness is automatically wrong. It's not.
For an extreme example, imagine a company installs a "pollution-preventing boulder" at their HQ. It's very valid to call that greenwashing and walk away. Don't let them get PR for nothing. If they were actually trying, and made a mistake, suggest a fix. But you can't fix fake.
> Baseless? The foundation of this accusation is rock solid. Offloading the exact same computation to another person so your energy numbers look better is not a greening or sustainability effort.
Yes, I consider it baseless for the following reasons:
- First, consider the hardware running in data centers, and the iDevices running at the edge – the iPhones, iPads and presumably Macs. There's a massive difference in power consumption between a data center full of GPUs, and whatever the equivalent might be in iDevices. Few chips come close to Apple's M-series in power usage.
- Second, Apple's commitment to making those devices carbon neutral by 2030; I'm unaware of any commitment to make cloud compute hardware carbon neutral, but I'll admit that I don't really keep up with that kind of hardware so I could be totally wrong there.
- Third, consider that an AI compute service (I'm not sure what you call it) like OpenAI is always running and crunching numbers in its data center, while the iDevices are each individually running only when needed by the user.
- Fourth, the people who own the iDevices may charge them using more sustainable methods than would power a data center. For example, Iowa – where I live – generates 62% of its energy from wind power and nearly two-thirds of its total energy from renewable resources [1], whereas California only gets 54% of its energy from renewable resources. Of course this cuts both ways, there are plenty of states or even countries that get most of their power from coal, like Ohio.
That said, neither of us have any real numbers on these things so the best either of us can do is be optimistic or pessimistic. But I'd rather do that and have a discussion about it, instead of dismiss it out of hand like everyone else does by saying "haha dumb, get greenwashed".
You're right that improvements don't need to be suggested to have a conversation about greening/greenwashing. My irritation lies more in the fact that it's almost a trope at this point that you can click into the comments on any HN story that mentions greening/sustainability, and there will be comments calling it fake greenwashing. I don't disagree that it's easy for a company to greenwash if they want to, but it's tiring to see everything called greenwashing without applying any critical thinking. Everyone wants to be so jaded about corporations that they'll never trust a word about it.
[1] Although this two-thirds total includes the bio-fuel ethanol, so I feel like it shouldn't be included.
1. Maybe, but wouldn't Apple want to use M-series chips to do this either way?
2. That's an interesting angle.
3. It's the same total amount, and both will go idle when there's less demand.
4. I think the average data center gets cleaner energy than the average house but I can't find a proper comparison so maybe that's baseless.
Also as far as I'm aware, inference takes significantly fewer resources when you can batch it.
> but it's tiring to see everything called greenwashing without applying any critical thinking
That does sound tiring, but in this particular case I think there was sufficient critical thinking, and it was originally brought up as just a possibility.
This is what I wanted to get across but you said it so much better.
> Which honestly is still good for the environment to have the work distributed across the entire electricity grid.
Sometimes, but parallelization has a cost. The power consumption from 400,000,000 iPhones downloading a 2GB LLM is not negligible, probably more than what you'd consume running it as a REST API on a remote server. Not to mention slower.
Downloading 2GB of anything on my iPhone via Wi-Fi from my in-home gigabit fiber barely puts a dent in my battery life, let alone takes much time.
The random ads in most phone games are much worse on my battery life.
Yeah, it's a shame that mobile games are shit when console and PC gaming gets taken so seriously by comparison. If you want to blame that on developers and not Apple's stupid-ass policies stopping you from emulating real games, be my guest. That's a take I'd love to hear.
Keep downloadin' those ads. This is what Apple wants from you, a helpless and docile revenue stream. Think Different or stay mad.
Blame? Simply saying downloading 2gb isn’t the power consumption hit you seem to think it is.
Not much of a gamer anyway, just an observation when I tried a couple apparently dodgy games.
Not sure why your reply wasn’t related to your original comment. Felt rather knee-jerk reactionary to me instead. Oh well.
It makes sense for desktops but not for devices with batteries. I think Apple should introduce a new device for $5-10k that has 400GB of VRAM that all Macs on the network use for ML.
If you're on battery, you don't want to do LLM inference on a laptop. Hell, you don't really want to do transcription inference for that long - but would be nice not to have to send it to a data center.
The fundamental problem with this strategy is model size. I want all my apps to be privacy first with local models, but there is no way they can share models in any kind of coherent way. Especially when good apps are going to fine tune their models. Every app is going to be 3GB+
Foundation models will be the new .so files.
This would be interesting but also feels a little restrictive. Maybe something like LoRA could bridge the capability gap, but if a competitor then drops a much more capable model you either have to ignore it or bring it into your app.
(assuming companies won't easily share all their models for this kind of effort)
And fine tuning datasets will be compressed and sent rather than the whole model
You could always mix and match: do lighter tasks on device and outsource to the cloud if needed.
You can do quite a lot with LoRA without having to replace all the weights.
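For readers unfamiliar with the technique being referenced: LoRA keeps the shared base weights frozen and ships only two small low-rank matrices per adapted layer. A toy sketch follows; shapes and names are illustrative, not any particular vendor's API.

```python
# Minimal sketch of the LoRA idea: instead of shipping a full fine-tuned copy
# of a weight matrix W, an app could ship only the small low-rank matrices
# A and B and apply W*x + scale * B(A(x)) at runtime.
import torch
import torch.nn as nn

class LoRALinear(nn.Module):
    def __init__(self, base: nn.Linear, r: int = 8, alpha: float = 16.0):
        super().__init__()
        self.base = base                      # frozen, shared base weights
        self.base.weight.requires_grad_(False)
        self.A = nn.Parameter(torch.randn(r, base.in_features) * 0.01)
        self.B = nn.Parameter(torch.zeros(base.out_features, r))
        self.scale = alpha / r

    def forward(self, x):
        return self.base(x) + self.scale * (x @ self.A.T @ self.B.T)

base = nn.Linear(4096, 4096, bias=False)
layer = LoRALinear(base, r=8)
# Adapter weights: 2 * 8 * 4096 params vs 4096 * 4096 for the full matrix.
print(sum(p.numel() for p in (layer.A, layer.B)))   # 65,536
print(base.weight.numel())                          # 16,777,216
```

So an app-specific adapter can be a few tens of megabytes against a multi-gigabyte shared foundation model, which is the whole point of the ".so files" analogy above.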
Gemini Nano is 1.8B parameters at 4 bits, so a little under a GB. And hopefully each app won’t include a full copy of its models.
I don't think HN understands how important model distillation still is for federated learning. Hype >> substance ITT
> In case it is not abundantly clear by now: Apple's AI strategy is to put inference (and longer term even learning)
I'm curious: is anyone seriously using Apple hardware to train AI models at the moment? Obviously not the big players, but I imagine it might be a viable option for AI engineers in smaller, less ambitious companies.
I like to think back to 2011 and paraphrase what people were saying: "Is anyone seriously using GPU hardware to write NL translation software at the moment?"
"No, we should use cheap, commodity, abundantly available CPUs and orchestrate them behind cloud magic to write our NL translation apps."
Or maybe: "No, we should build purpose-built high-performance computing hardware to write our NL translation apps."
Or perhaps in the early 70s "is anyone seriously considering personal computer hardware to ...". "no, we should just buy IBM mainframes ..."
I don't know. I'm probably super biased. I like the idea of all this training work breaking the shackles of cloud/mainframe/servers/off-end-user-device and migrating to run on people's devices. It feels "democratic".
I remember having lunch with a speech recognition researcher who was using GPUs to train DNNs to do speech recognition in 2011. It really was thought of as niche back then. But the writing was on the wall I guess in the results they were getting.
AMD didn't read the wall, unfortunately.
I don't think the examples really apply, because it's more a question of being on the "cutting edge" vs. personal hardware.
For example, running a local model and access to the features of a larger more capable/cloud model are two completely different features therefore there is no "no we should do x instead".
I'd imagine that a dumber local model runs and defers to a cloud model when it needs to, if the user has allowed it to go to the cloud. Apple could not compete on "our models run locally, privacy is a bankable feature" alone imo; TikTok's install base has shown us that users prefer content/features over privacy. They'll definitely still need state-of-the-art cloud-based models to compete.
Apple are. Their “Personal Voice” feature fine tunes a voice model on device using recordings of your own voice.
An older example is the “Hey Siri” model, which is fine tuned to your specific voice.
But with regards to on device training, I don’t think anyone is seriously looking at training a model from scratch on device, that doesn’t make much sense. But taking models and fine tuning them to specific users makes a whole ton of sense, and an obvious approach to producing “personal” AI assistants.
[1] https://support.apple.com/en-us/104993
They already do some “simple” training on device. The example I can think of is photo recognition in the photo library. It likely builds on something else, but being able to identify which face is your grandma versus your neighbor is not done in Apple's cloud. It's done when your devices are idle and plugged into power.
A few years ago it wasn’t shared between devices so each device had to do it themselves. I don’t know if it’s shared at this point.
I agree you’re not going to be training an LLM or anything. But smaller tasks, limited in scope, may prove a good fit.
Not really (I work on AI/ML Infrastructure at a well known tech company and talk regularly w/ our peer companies).
That said, inference on Apple products is a different story. There's definitely interest in inference on the edge. So far though, nearly everyone is still opting for inference in the cloud, for a few reasons:
1. There's a lot of extra work involved in getting ML/AI models ready for mobile inference, and this work is different for iOS vs. Android.
2. You're limited on which exact device models will run the thing optimally. Most of your customers won't necessarily have those, so you need some kind of fallback.
3. You're limited on what kind of models you can actually run. You have way more flexibility running inference in the cloud.
A cloud solution I looked at a few years ago could be replicated (poorly) in your browser today. In my mind the question has become one of determining when my model is useful enough to detach from the cloud, not whether that should happen.
Inference on the edge is a lot like JS - just drop a crap ton of data to the front end, and let it render.
Power for power, any thoughts on what mobile inference looks like vs doing it in the cloud?
Mobile can be more efficient. But you're making big tradeoffs. You are very limited in what you can actually run on-device. And ultimately you're also screwing over your user's battery life, etc.
PyTorch actually has surprisingly good support for Apple Silicon. Occasionally an operation needs to fall back to the CPU, but many applications are able to run inference entirely off the CPU, on the GPU.
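A minimal sketch of that path, assuming a recent PyTorch build with the "mps" backend; the model here is a stand-in, not any specific network.

```python
# Run inference on the GPU via PyTorch's "mps" backend and fall back to CPU
# when an op isn't supported (PyTorch can also do this per-op if the
# PYTORCH_ENABLE_MPS_FALLBACK=1 environment variable is set).
import torch
import torch.nn as nn

device = torch.device("mps" if torch.backends.mps.is_available() else "cpu")

model = nn.Sequential(nn.Linear(512, 512), nn.ReLU(), nn.Linear(512, 10)).to(device)
model.eval()

x = torch.randn(32, 512, device=device)
with torch.no_grad():
    try:
        logits = model(x)
    except RuntimeError:
        # coarse fallback: move everything to CPU if an MPS op is missing
        model, x = model.cpu(), x.cpu()
        logits = model(x)
print(logits.shape, logits.device)
```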
I’ve found it to be pretty terrible compared to CUDA, especially with Huggingface transformers. There’s no technical reason why it has to be terrible there though. Apple should fix that.
Yeah. It’s good with YOLO and Dino though. My M2 Max can compute Dino embeddings faster than a T4 (which is the GPU in AWS’s g4dn instance type).
MLX will probably be even faster than that, if the model is already ported. Faster startup time too. That’s my main pet peeve though: there’s no technical reason why PyTorch couldn’t be just as good. It’s just underfunding and neglect
t4's are like 6 years old
And there is a lot of work being done with mlx.
Yes, it can be more cost effective for smaller businesses to do all their work on Mac Studios, versus having a dedicated Nvidia rig plus Apple or Linux hardware for your workstation.
Honestly, you can train basic models just fine on M-Series Max MacBook Pros.
A decked out Mac Studio is like $7k for far less GPU power. I find that highly unlikely.
A non-decked out Mac Studio is a hell of a machine for $1999.
Do you also compare cars by looking at only the super expensive limited editions, with every single option box ticked?
I'd also point out that said 3 year old $1999 Mac Studio that I'm typing this on already runs ML models usefully, maybe 40-50% of the old 3000-series Nvidia machine it replaces, while using literally less than 10% of the power and making a tiny tiny fraction of the noise.
Oh, and it was cheaper. And not running Windows.
They are talking about training models, though. Run is a bit ambiguous, is that also what you mean?
No.
For training, the Macs do have some interesting advantages due to the unified memory. The GPU cores have access to all of system RAM (and the system RAM is ridiculously fast - 400GB/sec when a typical DDR4 setup manages a few tens of GB/sec - which has a lot of little fringe benefits of its own, part of why the Studio feels like an even more powerful machine than it actually is. It's just super snappy and responsive, even under heavy load.)
The largest consumer Nvidia card has 24GB of usable RAM.
The $1999 Mac has 32GB, and for $400 more you get 64GB.
$3200 gets you 96GB, and more GPU cores. You can hit the system max of 192GB for $5500 on an Ultra, albeit with the lesser GPU.
Even the recently announced 6000-series AI-oriented NVidia cards max out at 48GB.
My understanding is that a lot of enthusiasts are using Macs for training because, for certain things, having more RAM is simply enabling.
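A rough back-of-the-envelope for why RAM is "enabling": approximate weight memory at different precisions, counting weights only (KV cache, activations and framework overhead come on top, so real usage is higher).

```python
# Weight memory only: params * bits / 8 bytes. Real usage is always higher.
def weight_gb(params_billion: float, bits: int) -> float:
    return params_billion * 1e9 * bits / 8 / 1e9

for params in (8, 70):
    for bits in (16, 8, 4):
        print(f"{params}B @ {bits}-bit ≈ {weight_gb(params, bits):.0f} GB")
# 8B @ 4-bit ≈ 4 GB fits almost anywhere; 70B @ 16-bit ≈ 140 GB needs
# something like a 192GB unified-memory machine or several large GPUs.
```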
The huge number of optimizations available on Nvidia and not available on Apple makes the reduced VRAM worth it, because even the most bloated foundation model will have some magical 0.1-bit quantization technique invented by a turbo-nerd, and it will only work on Nvidia.
I keep hearing this meme of Macs being a big deal in LLM training, but I have seen zero evidence of it, and I am deeply immersed in the world of LLM training, including training from scratch.
Stop trying to meme Apple M chips as AI accelerators. I'll believe it when unsloth starts to support a single non-Nvidia chip.
Yeah, and I think people forget all the time that inference (usually batch_size=1) is memory bandwidth bound, but training (usually batch_size=large) is usually compute bound. And people use enormous batch sizes for training.
And while the Mac Studio has a lot of memory bandwidth compared to most desktop CPUs, it isn't comparable to consumer GPUs (the 3090 has a bandwidth of ~936GB/s), let alone those with HBM.
I really don't hear about anyone training on anything besides NVIDIA GPUs. There are too many useful features like mixed-precision training, and don't even get me started on software issues.
If you work for a company willing to shell out sure there are better options.
But for individual developers it’s an interesting proposition.
And a bigger question is: what if you already have (or were going to buy) a Mac? You prefer them or maybe are developing for Apple platforms.
Upping the chip or memory could easily be cheaper than getting a PC rig that’s faster for training. That may be worth it to you.
Not everyone is starting from zero or wants the fastest possible performance money can buy ignoring all other factors.
Agreed. Although inference is good enough on the Mac, there is no way I am training on them at all.
It's just more efficient to offload training to cloud Nvidia GPUs
But you get access to a very large amount of RAM for that price.
Don't attack me, I'm not disagreeing with you that an nVidia GPU is far superior at that price point.
I simply want to point out that these folks don't really care about that. They want a Mac for more reasons than "performance per watt/dollar" and if it's "good enough", they'll pay that Apple tax.
Yes, yes, I know, it's frustrating and they could get better Linux + GPU goodness with an nVidia PC running Ubuntu/Arch/Debian, but macOS is painless for the average science AI/ML training person to set up and work with. There are also known enterprise OS management solutions that business folks will happily sign off on.
Also, $7000 is chump change in the land of "can I get this AI/ML dev to just get to work on my GPT model I'm using to convince some VCs to give me $25-500 million?"
tldr; they're gonna buy a Mac because it's a Mac, they want a Mac, and their business uses Macs. No amount of "but my nVidia GPU = better" is ever going to convince them otherwise as long as there is a "sort of" reasonable price point inside Apple's ecosystem.
What Linux setup do you recommend for 128GB of GPU memory?
Not all of us who own small businesses are out here speccing AMD Ryzen 9s and RTX 4090s for workstations.
You can't lug around a desktop workstation.
> a dedicated Nvidia rig
I am honestly shocked Nvidia has been allowed to maintain their moat with CUDA. It seems like AMD would have a ton to gain by just spending a couple million a year to implement all the relevant ML libraries with a non-CUDA back-end.
AMD doesn’t really seem inclined toward building developer ecosystems in general.
Intel seems like they could have some interesting stuff in the annoyingly named “OneAPI” suite but I ran it on my iGPU so I have no idea if it is actually good. It was easy to use, though!
There have been quite a few back-and-forth X/Twitter storms in teacups between George Hotz / tinygrad and AMD management about opening up the firmware for custom ML integrations to replace CUDA, but last I checked they were running into walls.
I don't understand why you would need custom firmware. It seems like you could go a long way just implementing back-ends for popular ML libraries in OpenCL / compute shaders.
smaller businesses have no business having a dedicated GPU rig of any kind
I don't think this is what you meant, but it matches the spec: federated learning is being used by Apple to train models for various applications, and some of that happens on device (iPhones/iPads) with your personal data before it's hashed and sent up to the mothership model anonymously.
https://www.technologyreview.com/2019/12/11/131629/apple-ai-...
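For readers who haven't seen it, the core of federated averaging is small enough to sketch. This is a toy illustration of the general idea only, not Apple's (undisclosed) implementation, which layers anonymization and noise on top.

```python
# Toy federated averaging: each device computes an update on its own data and
# only the weight deltas leave the device; the server averages them.
import torch
import torch.nn as nn

def local_update(global_state, data, target, lr=0.01):
    model = nn.Linear(4, 2)
    model.load_state_dict(global_state)        # start from the global model
    loss = nn.functional.cross_entropy(model(data), target)
    loss.backward()
    with torch.no_grad():
        for p in model.parameters():
            p -= lr * p.grad                    # one local SGD step
    # only the delta is sent back, never the raw data
    return {k: model.state_dict()[k] - global_state[k] for k in global_state}

global_model = nn.Linear(4, 2)
state = global_model.state_dict()
deltas = [local_update(state, torch.randn(16, 4), torch.randint(0, 2, (16,)))
          for _ in range(3)]                    # three simulated devices
avg = {k: state[k] + sum(d[k] for d in deltas) / len(deltas) for k in state}
global_model.load_state_dict(avg)
```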
Does one need to train an AI model on specific hardware, or can a model be trained in one place and then used somewhere else? Seems like Apple could just run their fine tuned model called Siri on each device. Seems to me like asking for training on Apple devices is missing the strategy. Unless of course, it's just for purely scientific $reasons like "why install Doom on the toaster?" vs doing it for a purpose.
It doesn’t require specific hardware; you can train a neural net with pencil and paper if you have enough time. Of course, some pieces of hardware are more efficient than others for this.
Yes, there are a handful of apps that use the Neural Engine to fine-tune models on their data.
Isn't Apple hardware too expensive to make that worthwhile?
For business-scale model work, sure.
But you can get an M2 Ultra with 192GB of UMA for $6k or so. It's very hard to get that much GPU memory at all, let alone at that price. Of course the GPU processing power is anemic compared to a DGX Station A100, but the Mac is $143,000 less.
You want to buy a bunch of new equipment to do training? Yeah, Macs aren't going to make sense.
You want your developers to be able to do training locally and they already use Macs? Maybe an upgrade would make business sense. Even if you have beefy servers or the cloud for large jobs.
Yes, that would be great. But without the ability for us to verify this, who's to say they won't use the edge resources (your computer and electricity) to process data (your data) and then send the results to their data center? It would certainly save them a lot of money.
They already do this. It's called federated learning and it's a way for them to use your data to help personalize the model for you and also (to a much lesser extent) the global model for everyone, whilst still respecting your data privacy. It's not to save money; it's so they can keep your data private on device and still use ML.
https://www.technologyreview.com/2019/12/11/131629/apple-ai-...
When you can do all inference at the edge, you can keep it disconnected from the network if you don't trust the data handling.
I happen to think they wouldn't, simply because sending this data back to Apple in any form that they could digest it is not aligned with their current privacy-first strategies. But if they make a device that still works if it stays disconnected, the neat thing is that you can just...keep it disconnected. You don't have to trust them.
Except that's an unreasonable scenario for a smartphone. It doesn't prove that the minute the user goes online it won't be egressing data, willingly or not.
I don't disagree, although when I composed my comment I had desktop/laptop in mind, as I think genuinely useful on-device smartphone AI is a ways off yet, and who knows what company Apple will be by then.
To use a proprietary system and not trust the vendor, you have to never connect it. That’s possible of course, but it seems pretty limiting, right?
You seem to be describing face recognition in Photos like it's a conspiracy against you. You'd prefer the data center servers looking at your data?
If you trust that Apple doesn't film you with the camera when you use the phone while sitting on the toilet, why wouldn't you trust Apple now?
It would have to be a huge conspiracy involving all of Apple's employees. And you can easily just listen to the network and see if they do it or not.
I find it somewhat hard to believe that wouldn’t be in contravention of some law or other. Or am I wrong?
Of course we can then worry that companies are breaking the law, but you have to draw the line somewhere… and what have they to gain anyway?
+1. The idea that "it's on device, hence it's privacy-preserving" is Apple's marketing machine speaking, and that doesn't fly anymore. They have to do better to convince any security and privacy expert worth their salt that their claims and guarantees can be independently verified on behalf of iOS users.
Google did some of that on Android, which means open-sourcing their on-device TEE implementation, publishing a paper about it etc.
I'm all for running as much on the edge as possible, but we're not even close to being able to do real-time inference on Frontier models on Macs or iPads, and that's just for vanilla LLM chatbots. Low-precision Llama 3-8b is awesome, but it isn't a Claude 3 replacer, totally drains my battery, and is slow (M1 Max).
Multimodal agent setups are going to be data center/home-lab only for at least the next five years.
Apple isn't about to put 80GB of VRAM in an iPad, for about 15 reasons.
The entire software stack is non-free and closed-source. This means you'd be taking Apple at their word on "privacy". Do you trust Apple? I wouldn't, given their track record.
What track record?
They fought the FBI over unlocking iPhones when they could have just quietly complied with the request. I'd say they have a decent track record.
They might have been thinking of the recently discovered hardware backdoor issue, CVE-2023-38606 (see also Operation Triangulation). There was surprisingly little reporting on it.
Discussion: https://news.ycombinator.com/item?id=38783112
Transcript of Security Now podcast episode discussing the issue: https://www.grc.com/sn/sn-955.htm
Just install Little Snitch and you'll see how MUCH gets sent back to the mothership. And that is just macOS.
It isn't privacy if Apple knows.
They are the gatekeeper of your data for their benefit not yours.
Yes, at the end it's just some data representing the user's trained model. Is there a contractual agreement with users that Apple will never ever transfer a single byte of it, with huge penalties otherwise? If not, it's a pinky-swear PR promise that sounds nice.
Apple publicly documents their privacy and security practices.
At minimum, laws around the world prevent companies from knowingly communicating false information to consumers.
And in many countries the rules around privacy are much more stringent.
I bet Boeing also has documentation about their security practices.
Talk is cheap and in Apple's case it's part of their PR.
But what does that have to do with the price of milk in Turkmenistan.
Because Boeing's issues have nothing to do with privacy or security and since they are not consumer facing have no relevance to what we are talking about.
What is wrong with Boeing's security?
> What is wrong with Boeing's security?
Too many holes.
For everyone else who doesn't understand what this means, he's saying Apple wants you to be able to run models on their devices, just like you've been doing on nvidia cards for a while.
I think he's saying they want to make local AI a first class, default, capability, which is very unlike buying a $1k peripheral to enable it. At this point (though everyone seems to be working on it), other companies need to include a gaming GPU in every laptop, and tablet now (lol), to enable this.
Awesome. I'm going to go tell my mom she can just pull her Nvidia card out of her pocket at the train station to run some models.
On second thought.. maybe it isn't "just like you've been doing on Nvidia cards for a while"
Yes, it is completely clear. My guess is they do something like "Siri-powered shortcuts", where you can ask it to do a couple of things and it'll dynamically create a script and execute it.
I can see how a smaller model trained to do that might work well enough; however, I've never seen any real working examples of this. That Rabbit device is heading in that direction, but it's mostly vaporware now.
Pretty much my thoughts too. They're going to have a model smaller than 3B built in. They'll have tokens that directly represent functions / shortcuts.
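A toy sketch of what "tokens that directly represent functions/shortcuts" could look like in practice. Everything here is hypothetical: the token format, the function names and the dispatcher are illustrations, not any announced Apple API.

```python
# A small model emits a structured call like <call:set_timer minutes=10>
# and a local dispatcher maps it to an action; plain text passes through.
import re

def set_timer(minutes: str) -> str:
    return f"Timer set for {minutes} minutes"

def send_message(to: str, body: str) -> str:
    return f"Message to {to}: {body}"

HANDLERS = {"set_timer": set_timer, "send_message": send_message}
CALL_RE = re.compile(r"<call:(\w+)((?:\s+\w+=\S+)*)>")

def dispatch(model_output: str) -> str:
    m = CALL_RE.search(model_output)
    if not m:
        return model_output                   # ordinary text answer
    name, arg_str = m.group(1), m.group(2)
    kwargs = dict(pair.split("=", 1) for pair in arg_str.split())
    return HANDLERS[name](**kwargs)

print(dispatch("<call:set_timer minutes=10>"))
```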
This comment is odd. I wouldn't say it is misleading, but it is odd because it borders on that definition.
> Apple's AI strategy is to put inference (and longer term even learning) on edge devices
This is pretty much everyone's strategy. Model distillation is huge because of this. This goes in line with federated learning. This goes in line with model pruning too. And parameter efficient tuning and fine tuning and prompt learning etc.
> This is completely coherent with their privacy-first strategy
Apple's marketing for their current approach is privacy-first. They are not privacy first. If they were privacy first, you would not be able to use app tracking data on their first party ad platform. They shut it off for everyone else but themselves. Apple's approach is walled garden first.
> Processing data at the edge also makes for the best possible user experience because of the complete independence of network connectivity
as long as you don't depend on graph centric problems where keeping a local copy of that graph is prohibitive. Graph problems will become more common. Not sure if this is a problem for apple though. I am just commenting in general.
> If (and that's a big if) they keep their APIs open to run any kind of AI workload on their chips
Apple does not have a good track record of this; they are quite antagonistic when it comes to this topic. Gaming on Apple was dead for nearly a decade (and pretty much still is) because Steve Jobs did not want people gaming on Macs. Apple has eased up on this, but it very much seems that if they want you to use their devices (not yours) in a certain way, then they make it expensive to do anything else.
Tbf, I don't blame Apple for any of this. It is their strategy. Whether it works or not, it doesn't matter. I just found this comment really odd since it almost seemed like evangelism.
edit: weird to praise Apple for on-device training when it is not publicly known if they have trained any substantial model even in the cloud.
Everyone’s strategy?
The biggest players in commercial AI models at the moment - OpenAI and Google - have made absolutely no noise about pushing inference to end user devices at all. Microsoft, Adobe, other players who are going big on embedding ML models into their products, are not pushing those models to the edge, they’re investing in cloud GPU.
Where are you picking up that this is everyone’s strategy?
> Where are you picking up that this is everyone’s strategy?
Read what their engineers say in public. Unless I hallucinated years of federated learning.
Also apple isn't even a player yet and everyone is discussing how they are moving stuff to the edge lol. Can't critique companies for not being on the edge yet when apple doesn't have anything out there.
8 days after this comment, Google announced Gemini Nano local inference for Chrome.
I believe at least Google is starting to do edge inference—take a look at the Pixel 8 lineup they just announced. It doesn't seem to be emphasized as much, but the Tensor G3 chip certainly has built-in inference.
Of course Google is. That's what Gemini Nano is for.
> This is pretty much everyone's strategy.
I think this is being too charitable on the state of "everyone". It's everyone's goal. Apple is actively achieving that goal, with their many year strategy of in house silicon/features.
> Apple is actively achieving that goal, with their many year strategy of in house silicon/features
So are other companies, with their many-year strategy of actually building models that are accessible to the public.
Yet Apple is "actively" achieving the goal without any distinct models.
No. "On edge" is not a model existence limitation, it is a hardware capability/existence limitation, by definition, and by the fact that, as you point out, the models already exist.
You can already run those open weight models on Apple devices, on edge, with huge improvements on the newer hardware. Why is a distinct model required? Do the rumors appease these thoughts?
If others are making models, with no way to actually run them, that's not a viable "on edge" strategy, since it involves waiting for someone else to actually accomplish the goal first (as is being done by Apple).
> "On edge" is not a model existence limitation
It absolutely is. Model distillation will still be pertinent, and so will parameter-efficient tuning for edge training. I cannot emphasize enough how important this is. You will need your own set of weights. If Apple wants to use open weights, then sure, ignore this; it doesn't seem like they want to long-term. And even if they use open weights, they will still be behind other companies that have done model distillation and federated learning for years.
> Why is a distinct model required?
Ask apple's newly poached AI hires this question. Doesn't seem like you would take an answer from me.
> If others are making models, with no way to actually run them
Is this the case? People have been running distilled llamas on rPis with pretty good throughput.
> And even if they use open weights, they will still be behind other companies that have done model distillation and federated learning for years.
I'm sorry, but we're talking about "on edge" here though. Those other companies have no flipping hardware to run it "on edge", in a "generic" way, which is the goal. Apple's strategy involves the generic.
> If Apple wants to use open weights
This doesn't make sense. Apple doesn't dictate the models you can use with their hardware. You can already accelerate LLAMA with the neural engines. You can download the app right now. You can already deploy your models on edge, on their hardware. That is the success they're achieving. You cannot effectively do this on competitor hardware, with good performance, from "budget" to "Pro" lineup, which is a requirement of the goal.
> they will still be behind other companies that have done model distillation and federated learning for years.
What hardware are they running it on? Are they taking advantage of Apple (or other) hardware in their strategy? Federated learning is an application of "on edge", it doesn't *enable* on edge, which is part of Apple's strategy.
> Ask apple's newly poached AI hires this question. Doesn't seem like you would take an answer from me.
Integrating AI in their apps/experience is not the same as enabling a generic "on edge", default, capability in all Apple devices (which they have been working towards for years now). This is the end goal for "on edge". You seem to be talking about OS integration, or something else.
> People have been running distilled llamas on rPis with pretty good throughput.
Yes, the fundamental limitation there being hardware performance, not the model, with that "pretty good" making the "pretty terrible" user experience. But, there's also nothing stopping anyone from running these distilled (a requirement of limited hardware) models on Apple hardware, taking advantage of Apples fully defined "on edge" strategy. ;) Again, you can run llamas on Apple silicon, accelerated, as I do.
> Those other companies have no flipping hardware to run it "on edge", in a "generic" way, which is the goal
Maybe? This is why I responded to:
> It's everyone's goal. Apple is actively achieving that goal
This is the issue I found disagreeable. Other organizations and individual people are achieving that goal too. Google says Gemini Nano is going to the device, and if the benchmarks are to be believed and it runs at that level, their work so far is also actively achieving that goal. Meta has released multiple distilled models that people have already proven can run inference at the device level, so it cannot be argued that Meta is not actively achieving that goal either. They don't have to release the hardware because they went a different route. I applaud Apple for the M chips; they are super cool. People are still working on using them so Apple can realize that goal too.
So when you go to the statement that started this
> Apple's AI strategy is to put inference (and longer term even learning) on edge devices
Multiple orgs also share this, and I can't say that one particular org is super ahead of the others. And I can't elevate Apple in that race, because it is not clear that they are truly privacy-focused or that they will keep APIs open.
> You cannot effectively do this on competitor hardware, with good performance, from "budget" to "Pro" lineup, which is a requirement of the goal
Why do you say you cannot do this with good performance? How many tokens per second do you want on a device? Is 30 tok/s enough? You can do that on laptops running a small Mixtral.
> What hardware are they running it on? Are they taking advantage of Apple (or other) hardware in their strategy?
I don't know. I have nothing indicating necessarily apple or nvidia or otherwise. Do you?
> [Regarding the rest]
Sure, my point is that they definitely have an intent for bespoke models, which is why I raised the point that not all computation will be feasible on the edge for the time being. What prompted this particular line of inquiry is whether a pure edge experience truly enables the best user experience, and it's also why I raised the point about Apple's track record with open APIs. That is why I put doubt on "actively achieving", and why I also cast doubt on Apple being privacy-focused. Just tying it back to the reason I even commented.
A week after this comment, Google announced Gemini Nano running locally in Chrome.
This has nothing to do with privacy; Google is also pushing Gemini Nano to the device. The sector is discovering the diminishing returns of LLMs.
With the AI cores on phones they can cover the average user's use cases with a light model, without the server expense.
Don't bother. Apple's marketing seems to have won on here. I made a similar point, only for people to tell me that Apple is the only org seriously pushing federated learning.
> Processing data at the edge also makes for the best possible user experience because of the complete independence of network connectivity and hence minimal latency.
I know a shop who's doing this and it's a very promising approach. The ability to offload the costs of cloud GPU time is a tremendous advantage. That's to say nothing of the decreased latency, increased privacy, etc. The glaring downside is that you are dependent upon your users to be willing and able to run native apps (or possibly WASM, I'm not sure) on bleeding edge hardware. However, for some target markets (e.g. video production, photography, designers, etc.) it's a "safe" assumption that they will be using the latest and greatest Macs.
I've also been hearing people talk somewhat seriously about setting up their own training/inference farms using Macs because, at least for now, they're more readily available and cheaper to buy/run than big GPUs. That comes with a host of ops problems but it still may prove worthwhile for some use cases and addresses some of the same privacy concerns as edge computing if you're able to keep data/computation in-house.
I think these days everyone links their products with AI. Today even BP's CEO linked his business with AI. Edge inference and cloud inference are not mutually exclusive choices. Any serious provider will provide both, and the improvement in quality of service comes from you giving more of your data to the service provider. Most people are totally fine with that, and that will not change any time soon. Privacy paranoia is mostly a fringe thing in consumer tech.
I agree. Apple has been on this path for a while, the first processor with a Neural Engine was the A11 in 2017 or so. The path didn’t appear to change at all.
The big differences today that stood out to me were adopting AI as a term (they used machine learning before) and repeating the term AI everywhere they could shove it in since that’s obviously what the street wants to hear.
That’s all that was different. And I’m not surprised they emphasized it given all the weird “Apple is behind on AI“ articles that have been going around.
Hey, could I get a source on the BP stuff please? Just curious, as I couldn't find anything on the interwebs.
https://x.com/zerohedge/status/1787822478686851122
https://www.reddit.com/r/stocks/comments/1cmi5gj/bp_earnings...
I've been saying the same thing since the ANE and the incredible new chips with shared RAM: suddenly everyone could run capable local models. But then Apple decided to be catastrophically stingy once again, putting a ridiculous 8GB of RAM in these new iPads and their new MacBook Airs, destroying the prospect of a widespread "intelligent local Siri" because now half the new generation can't run anything.
Apple is an amazing powerhouse but also disgustingly elitist and wasteful if not straight up vulgar in its profit motives. There's really zero idealism there despite their romantic and creative legacy.
There are always some straight-up idiotic limitations in their otherwise incredible machines, with no other purpose than to create planned obsolescence, "PRO" exclusivity and piles of e-waste.
> to put inference on edge devices...
It will take a long time before you can put performant inference on edge devices.
Just download one of the various open-source large(st) language models and test it on your desktop...
Compute power, memory and storage requirements are insane if you want decent results... I mean not just Llama gibberish.
Until such requirements are satisfied, remote models are the way to go, at least for conversational models.
Aside from LLMs, AlphaGo would not run on any end-user device, by a long shot, even though it is an already 'old' technology.
I think a 'neural engine' on an end-user device is just marketing nonsense at the current state of the art.
On “privacy”: if Apple owned the search app instead of being paid by Google, and used their own ad network (which they already have for the App Store today), Apple would absolutely use your data, location, etc. to target you with ads.
It could even be third-party services sending ad candidates directly to your phone, with the on-device AI choosing which are relevant.
Privacy is a contract, not the absence of a clear business opportunity. Just look at how Apple does testing internally today. They have no more respect for human privacy than any of their competitors. They just differentiate through marketing and design.
Something they should be able to do now, but do not seem to, is let you train Siri to recognize exactly your voice and accent. That is, take the speech-to-text model that is listening and feeding the Siri integration API, and make it both 99.99% accurate for your speech and able to recognize you, and only you, when it comes to invoking voice commands.
It could, if it chose to, continue to recognize all voices but at the same time limit the things the non-owner could ask for based on owner preferences.
This is really easy to do: it's just an embedding of your voice, so typically 10-30 seconds of audio at most to configure it. You already do a similar setup for Face ID. I agree with you, I don't understand why they don't do it.
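A sketch of that enroll-then-verify flow, assuming some speaker-embedding model exists; the embed function below is only a placeholder and the threshold is made up.

```python
# Enroll the owner from a few clips, then verify new utterances by cosine
# similarity against the enrolled embedding.
import torch
import torch.nn.functional as F

def embed(audio: torch.Tensor) -> torch.Tensor:
    # placeholder: a real system would run a speaker-verification network here
    return torch.randn(256)

def enroll(clips: list[torch.Tensor]) -> torch.Tensor:
    # average embeddings over ~10-30s of the owner's speech
    return torch.stack([embed(c) for c in clips]).mean(dim=0)

def is_owner(utterance: torch.Tensor, owner_vec: torch.Tensor,
             threshold: float = 0.7) -> bool:
    score = F.cosine_similarity(embed(utterance), owner_vec, dim=0)
    return score.item() >= threshold

owner_vec = enroll([torch.randn(16000 * 5) for _ in range(4)])   # fake clips
print(is_owner(torch.randn(16000 * 3), owner_vec))
```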
"Which would be at odds with sending data up to the cloud for processing"
I don't think there is enough information here for that to be a true claim. It is possible to send the input used for inference to the cloud while computing the result locally. It is also possible to store the input used for inference while offline, and send it later when the device is online.
> Processing data at the edge also makes for the best possible user experience because of the complete independence of network connectivity and hence minimal latency.
That's one particular set of trade-offs, but not necessarily the best. E.g., if your network connection plus server processing is sufficiently faster than your local processing, the end-to-end latency would actually be higher for doing it locally.
Local inference can also use more battery power. And you need a beefier device, all else being equal.
Every embedded company is pushing ML at the edge with inference engines. Check out MLPerfTiny. They’ve been benchmarking all sorts of edge AI since 2019.
If they're doing inference on edge devices, one challenge I see is protecting model weights. If you want to deploy a proprietary model on an edge AI chip, the weights can get stolen via side-channel attacks [1]. Obviously this isn't a concern for open models, but I doubt Apple would go the open models route.
[1] https://spectrum.ieee.org/how-prevent-ai-power-usage-secrets
Nobody is taking particular care protecting weights for edge class models
Privacy can actually be reduced with on-device AI too. Now, without actually sending any data to iCloud, Apple can still have a general idea of what you're doing. Imagine a state has a law that makes certain subjects illegal to discuss. They could compel Apple to have their local AI detect that content and then broadcast a ping in the AirTags network about the user and their location. No internet connection required on the target.
Also, they don't have to pay either the capex or opex costs of training a model if they get users' devices to train the models.
> This is completely coherent with their privacy-first strategy
Apple has never been privacy-first in practice. They give you the illusion of privacy but in reality it's a closed-source system and you are forced to trust Apple with your data.
They also make it a LOT harder than Android to execute your own MITM proxies to inspect what exact data is being sent about you by all of your apps including the OS itself.
You say that like open source isn't also an illusion of trust.
The reality is, there's too much to verify, and not enough interest for the "many eyeballs make all bugs shallow" argument.
We are, all of us, forced to trust, forced to go without the genuine capacity to verify. It's not great, and the best we can do is look for incentives and try to keep those aligned.
Open source is like democracy. Imperfect and easy to fuck up, but still by far the best thing available.
Apple is absolutism. Even the so called "enlightened" absolutism is still bad compared to average democracy.
Open Source is how that XZ hack got caught.
Selection bias — everyone only knows about the bugs that do get caught.
I was one of many who reported a bug in Ubuntu that went un-fixed for years, where the response smelled of nation-state influence: https://bugs.launchpad.net/ubuntu/+bug/1359836
And Log4Shell took about 8 years to notice: https://en.wikipedia.org/wiki/Log4Shell
And we have no idea how many such bugs are lurking in closed-source software.
Thanks for making my point for me.
You've missed my point if you think I've made yours for you.
I'm not saying closed source is a silver bullet.
I'm saying OSS also isn't a silver bullet, it doesn't find everything because there's not enough interest in doing this work.
The Log4j example alone, given it took 8 years, is enough to demonstrate that.
Everything is an illusion of trust, nothing is perfect; all we can do is try to align the interests of those working on projects with the interests of society — which is so hard that it's an entire field of study called "politics".
I don't agree with relying on the many eyeballs argument for security, but from a privacy standpoint, I do think at least the availability of source to MY eyeballs, as well as the ability to modify, recompile, and deploy it, is better than "trust me bro I'm your uncle Steve Jobs and I know more about you than you but I'm a good guy".
If you want to, for example, compile a GPS-free version of Android that appears like it has GPS but in reality just sends fake coordinates to keep apps happy thinking they got actual permissions, it's fairly straightforward to make this edit, and you own the hardware so it's within your rights to do this.
Open-source is only part of it; in terms of privacy, being able to see everything that is being sent in/out of my device is arguably more important than open source. Closed source would be fine if they allowed me to easily inject my own root certificate for this purpose. If they aren't willing to do that, including a 1-click replacement of the certificates in the various third-party, certificate-pinning apps that are themselves potential privacy risks, it's a fairly easy modification to any open source system.
A screen on my wall that flashes every JSON that gets sent out of hardware that I own should be my right.
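For what it's worth, you can get a crude version of that today with mitmproxy, provided the device trusts the proxy's CA and the app isn't certificate-pinned. A minimal addon sketch (the file name and filtering are illustrative):

```python
# Run with `mitmdump -s log_outgoing.py` and route the device's traffic
# through the proxy (with its CA certificate installed on the device).
# Certificate-pinned apps will still refuse the proxy.
from mitmproxy import http

class LogOutgoing:
    def request(self, flow: http.HTTPFlow) -> None:
        if "json" in (flow.request.headers.get("content-type") or ""):
            print(flow.request.method, flow.request.pretty_url)
            print((flow.request.get_text() or "")[:500])  # first 500 chars

addons = [LogOutgoing()]
```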
> Open-source is only part of it; in terms of privacy, being able to see what all is being sent in/out of my device is is arguably more important than open source.
I agree; unfortunately it feels as if this ship has not only sailed, but the metaphor would have to be expanded to involve the port as well.
Is it even possible, these days, to have a functioning experience with no surprise network requests? I've tried to limit mine via an extensive hosts file list, but that did break stuff even a decade ago, and the latest version of MacOS doesn't seem to fully respect the hosts file (weirdly it partially respects it?)
> A screen on my wall that flashes every JSON that gets sent out of hardware that I own should be my right.
I remember reading a tale about someone, I think it was a court case or an audit, who wanted every IP packet to be printed out on paper. They only backed down when the volume was given in articulated lorries per hour.
I sympathise, but you're reminding me of that.
> Apple has never been privacy-first in practice
> They also make it a LOT harder than Android to execute your own MITM proxies
I would think ease of MITM and privacy are opposing concerns
Yeah, given that they resisted putting RCS in iMessage for so long, I am a bit skeptical about the whole privacy narrative. Especially when Apple's profit is at odds with user privacy.
From my understanding, the reason RCS was delayed is because Google's RCS was E2EE only in certain cases (both users using RCS). But also because Google's RCS runs through Google servers.
If Apple enabled RCS in messages back then, but the recipient was not using RCS, then Google now has the decrypted text message, even when RCS advertises itself as E2EE. With iMessage, at least I know all of my messages are E2EE when I see a blue bubble.
Even now, RCS is available on Android if using Google Messages. Yes, it's pre-installed on all phones, but OEMs aren't required to use it as the default. It opens up more privacy concerns because now I don't know if my messages are secure. At least with the green bubbles, I can assume that anything I send is not encrypted. With RCS, I can't be certain unless I verify the messaging app the recipient is using and hope they don't replace it with something else that doesn't support RCS.
You know what would really help Apple customers increase their privacy when communicating with non-Apple devices?
Having iMessage available to everyone regardless of their mobile OS.
Agreed. While I have concerns regarding RCS, Apple's refusal to make iMessage an open platform due to customer lock-in is ridiculous and anti-competitive.
> "due to customer lock-in"
Their words or your words?
“moving iMessage to Android will hurt us more than help us.”
RCS is a net loss for privacy: it gives the carriers visibility into your social graph and doesn’t support end to end encryption. Google’s PR campaign tried to give the impression that RCS supports E2EE but it’s restricted to their proprietary client.
On top of that, rooted devices are denied access to it, which means Google is now gatekeeping a "carrier" service even more.
> rooted devices are denied access to it
By what? It's impossible for a process to know for sure if the system is rooted or not. A rooted system can present itself to a process to look like a non-rooted system if it's engineered well enough.
I'd bet that most of these apps probably just check if "su" returns a shell, in which case perhaps all that's needed is to modify the "su" executable to require "su --magic-phrase foobar" before it drops into a root shell, and returns "bash: su: not found" or whatever if called with no arguments.
>A rooted system can present itself to a process to look like a non-rooted system if it's engineered well enough.
That was true 20 years ago, but most smartphones these days have cryptographically verified boot chains and remote attestation of the boot state.
How is RCS a win on the privacy front? It's not even E2E encrypted in an interoperable way (Google's implementation is proprietary).
Have you seen Apple's latest ad? It's a press crushing the tools and results of human creativity:
https://nitter.esmailelbob.xyz/tim_cook/status/1787864325258...
Edge inference and cloud inference are not mutually exclusive and chances are any serious player would be dipping their toes in both.
Right. The difference is that Apple has a ton of edge capacity, they’ve been building it for a long time.
Google and Samsung have been building it too, at different speeds.
Intel and AMD seem further behind (at the moment) unless the user has a strong GPU, which is especially uncommon on the most popular kind of computer: laptops.
And if you’re not one of those four companies… you probably don’t have much capable consumer edge hardware.
>Apple's AI strategy is to put inference (and longer term even learning) on edge devices
Ironic, given that AI requires lots of VRAM.
> This is completely coherent with their privacy-first strategy (...)
I think you're trying too hard to rationalize this move as pro-privacy and pro-consumer.
Apple is charging a premium for hardware based on performance claims, which it needs in order to create relevance and demand.
Beyond very niche applications, there is close to zero demand for running workloads that count as computationally demanding by the standard of the consumer-grade hardware sold over the past two decades.
If Apple offloads these workloads onto the customer's own hardware, it doesn't have to provide the computing capacity itself. That means no global network of data centers, no infrastructure, no staff, no customer support, no lawyers, nothing.
More importantly, Apple claims to be pro-privacy, but its business moves are really about ensuring it is in sole control of its users' data. Call it what you want, but leveraging its position to hold a monopoly over a market built on its user base is not a pro-privacy move, just as its abuse of control over the App Store is not a security move.
>I personally really really welcome as I don't want the AI future to be centralised in the hands of a few powerful cloud providers.
Watch out for being able to run AI on your local machine while those AI services still use telemetry to send your data (recorded conversations, for instance) back to their motherships.
I agree but for a different reason.
Right now the subscription is $20 a month and the API pricing is accessible. What happens when they all decide to raise their API prices 100x or 1000x? All the companies that got rid of people in favor of AI might have lost the knowledge as well. That is dangerous and could kill a lot of companies, no?
> and those ai services using telemetry to send your data (recorded conversations, for instance) to their motherships
This doesn't require AI, and I am not aware of any instances of it happening today, so what exactly are we watching out for?
There is no guarantee that local processing will have lower latency than remote processing. Given the huge compute needs of some AI models (e.g. ChatGPT), the time saved by using larger compute likely dwarfs the relatively small time needed to transmit a request.
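A back-of-envelope sketch of that point, with made-up numbers (the throughput and round-trip figures below are assumptions, not measurements):

    tokens = 500                 # length of the reply we want

    local_tok_per_s = 15         # assumed on-device decode speed
    cloud_tok_per_s = 80         # assumed datacenter decode speed
    network_rtt_s = 0.15         # assumed round trip + queueing overhead

    local_time = tokens / local_tok_per_s                  # ~33.3 s
    cloud_time = network_rtt_s + tokens / cloud_tok_per_s  # ~6.4 s

    print(f"local: {local_time:.1f}s, cloud: {cloud_time:.1f}s")

With numbers anywhere in that ballpark, the network hop is noise compared to generation time; the picture only flips once the local model is small and fast enough that per-token speed stops being the bottleneck.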
>Apple's AI strategy is to put inference (and longer term even learning) on edge devices.
Apple's AI strategy is to put inference (and longer term even learning) on edge devices...only for Apple stuff.
There is a big difference. ANE right now is next to useless for anything not Apple.
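To be fair, there is one sanctioned, if indirect, route: you can't program the ANE directly, but converting a model to Core ML lets the OS schedule parts of it there. A minimal sketch using coremltools (the toy model, shapes, and file name are placeholders, and whether layers actually land on the Neural Engine is entirely up to Core ML's scheduler):

    import torch
    import coremltools as ct

    # Toy model standing in for whatever you actually want to run on-device.
    model = torch.nn.Sequential(torch.nn.Linear(128, 64), torch.nn.ReLU()).eval()
    example = torch.randn(1, 128)
    traced = torch.jit.trace(model, example)

    mlmodel = ct.convert(
        traced,
        inputs=[ct.TensorType(shape=example.shape)],
        convert_to="mlprogram",
        compute_units=ct.ComputeUnit.ALL,  # let Core ML use CPU, GPU and ANE
    )
    mlmodel.save("Tiny.mlpackage")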
I think that’s going to change with WWDC
Nah. Apple doesn't have incentive to provide any more dev power. It will keep things locked down and charge people for Apple branded software products. That has been their business for the past decade.
I think there's always been a tension at Apple between keeping everything as locked down as possible and opening up parts because they need the developer driven app ecosystem. My prediction is Neural Engine is going to become more useful to third party developers. I could be wrong
My cynical view is that doing AI on the client is the only way they can try to keep selling luxury items (jewelry really) and increasing prices for what are essentially and functionally commodity devices.
They're a hardware company, so yes, they want to sell thick clients.
> This is completely coherent with their privacy-first strategy
How can they have a privacy first strategy when they operate an Ad network and have their Chinese data centers run by state controlled companies?
I think the more correct assertion would be that Apple is a sector leader in privacy, if only because its competitors make no bones about violating their customers' privacy, since that is the basis of their business model. So it's not that Apple is earning an A+ so much as the other students are getting Ds and Fs.
How can I have mint choc and pineapple swirl ice cream when there are children starving in Africa?
Fuck them kids. It's delicious af.
In case it is not abundantly clear by now: Apple's strategy is to turn developers into slaves.
Run away from these temptations. You will never truly own the hardware anyway.
Probably because they are super-behind in the cloud space; it's not like they wouldn't like to sell the service. They've ignored photo privacy quite a few times in iCloud.
Is it surprising, given that they've effectively given the finger to data-center hardware designs?
I hope this means AI-accelerated frameworks get better support on Mx. Unified memory and Metal are a pretty good alternative for local deep learning development.
So for hardware-accelerated training with something like PyTorch, does anyone have a good comparison between Metal and CUDA, both in terms of performance and capabilities?
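Not aware of a definitive head-to-head, but the nice thing is that the same PyTorch code runs on both backends, so you can measure your own workload. A minimal, device-agnostic sketch (layer and batch sizes are arbitrary; note that the MPS backend still lacks some ops, so coverage as well as speed is worth checking):

    import torch

    # Pick whichever accelerator is present: CUDA on NVIDIA boxes,
    # MPS on Apple Silicon, CPU otherwise.
    if torch.cuda.is_available():
        device = torch.device("cuda")
    elif torch.backends.mps.is_available():
        device = torch.device("mps")
    else:
        device = torch.device("cpu")

    model = torch.nn.Linear(512, 10).to(device)
    opt = torch.optim.SGD(model.parameters(), lr=0.01)
    x = torch.randn(64, 512, device=device)
    y = torch.randint(0, 10, (64,), device=device)

    loss = torch.nn.functional.cross_entropy(model(x), y)
    loss.backward()
    opt.step()
    print(device, loss.item())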
Then putting only 256 GB into their cheaper devices is a really bad move. Even simple models like Whisper require hundreds of megabytes of storage.
Privacy from everyone but Apple, certainly.
Every chip coming this year (Intel, AMD, Qualcomm) has an AI processor. I am not sure Apple is doing anything special here.
Yes this began with the acquisition of xnor.ai. Absolutely amazing what will be done (and is being done) with edge computing.
If inference on edge devices is their goal, then they would have to rethink their pricing on storage and RAM.
> This is completely coherent with their privacy-first strategy
You mean... with their professed privacy-first strategy.
How is local more private? Whether AI runs on my phone or in a data center I still have to trust third parties to respect my data. That leaves only latency and connectivity as possible reasons to wish for endpoint AI.
If you can run AI in airplane mode, you are not trusting any third party, at least until you reconnect to the Internet. Even if the model was malware, it wouldn’t be able to exfiltrate any data prior to reconnecting.
You’re trusting the third party at training time, to build the model. But you’re not trusting it at inference time (or at least, you don’t have to, since you can airgap inference).
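For example, something like this (a sketch assuming llama-cpp-python is installed and a GGUF file has already been downloaded to a local, purely illustrative path) touches the network only when you fetch the weights, never at inference time:

    from llama_cpp import Llama

    # Weights were downloaded earlier; from here on everything is offline.
    llm = Llama(model_path="./models/local-7b.gguf", n_ctx=2048)

    out = llm("Summarise the meeting notes above in three bullet points.",
              max_tokens=128)
    print(out["choices"][0]["text"])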
What are examples of the edge devices made by Apple?
MacBook, iPhone, iPad?
So laptops are now edge devices?
Doesn’t it just refer to the end user device as opposed to a server somewhere or a middle box?
Last time I looked for the definition, nobody can agree on whether client devices count as edge or not.
These don't sit on the edge of the Internet, and typically aren't called edge devices.
An edge device is usually something more powerful, such as a router or mini server sitting between the LAN and the Internet.
Apple privacy is marketing https://www.eurekalert.org/news-releases/1039938
Ehhh at this point Apple’s privacy strategy is little more than marketing. Sure they’ll push stuff to the edge to save themselves money and book the win, but they also are addicted to the billions they make selling your searches to Google.
Agreed on the UX improvements though.
The iPhone Photos app already does incredible image-subject search via on-device ML, versus Android, which does it in the cloud.
Apple is UX-first, not privacy-first.
> privacy-first strategy
That's just their way of walled-gardening Apple customers. Then they can squeeze devs and other companies dry without any middlemen.
> This is completely coherent with their privacy-first strategy
Is this the same Apple whose devices do not work at all unless you register an Apple account?
That's not true though? I reset and set up devices for testing all the time, and you can skip logging into an Apple ID.
For real usage, not for testing a single app. And I mean phones.
Some people really do seem delusional. It's obvious that the company's "privacy" is a marketing gimmick when you consider the facts. Do people not consider the facts anymore? How does anyone appeal to the company's "privacy-first strategy" with a straight face in light of them? I suppose they are not aware of the advertising ID embedded in all Apple operating systems. That one doesn't even require a login.
What are the facts that people are not considering?
The advertising ID is useless as of App Tracking Transparency.
Considering the facts is much harder when admitting a mistake is involved.
A "mistake" seems to be putting it lightly when the thing has been reiterated multiple times throughout the years, but yeah. Seems more like blind dogma. Obviously people don't like the facts pointed out to them either as you can tell by the down votes on my comment. If I am wrong, please tell me how in a reply.
Honestly, if they manage this, they have my money. But to get genuinely powerful models running, they need to ship the devices with enough RAM, and that's definitely not something Apple likes to do.
> This is completely coherent with their privacy-first strategy
Do not believe what they say, watch what they do.
> This is completely coherent with their privacy-first strategy (which would be at odds with sending data up to the cloud for processing).
I mean, yeah, that makes good marketing copy, but it's more about reducing latency and keeping running costs down.
But as this is mostly marketing fluff, we'll need to actually see how it performs before casting judgment on how "revolutionary" it is.
And yet Siri is super slow because it does the processing off-device, and it is far less useful than it could be because it is hobbled by restrictions.
I can't even find a way to resume playing whatever Audible book I was last listening to, with "Siri, play Audible" or something like it. As far as I know, this is impossible to do.
Why is Siri still so terrible though?
> This is completely coherent with their privacy-first strategy (which would be at odds with sending data up to the cloud for processing).
I feel like people are being a bit naïve here. Apple's "Privacy First" strategy was a marketing spin developed in response to being dead-last in web-development/cloud computing/smart features.
Apple has had no problem changing their standards by 180 degrees and being blatantly anti-consumer whenever they have a competitive advantage to do so.
Having worked at Apple, I can assure you it's not just spin. It's nigh on impossible to get permission to even compare your data with another service inside of Apple, and even if you do get permission, the user IDs and everything else are completely different, so there's no way to match up users. Honestly, it's kind of ridiculous the lengths they go to, and it makes development an absolute PITA.
That could very well be true, but I also think it could change faster than people realize. Or that Apple has the ability to compartmentalize (kind of like how Apple can advocate for USB C adoption in some areas and fight it in others).
I'm not saying this to trash Apple - I think it's true of any corporation. If Apple starts losing revenue in five years because their LLM isn't good enough because they don't have enough data, they are still going to take the data, and they'll have some justification for why theirs is privacy-focused and everyone else's is not.
As an Apple alum, I can agree with everything you’ve said.
"As a prior employee of the government, I also saw nobody mishandle our personal information."
Limited firsthand accounts don't really assuage my fears in the way that, say, Android or Linux do.
Of course! The difference is that, for the time being, my incentives are aligned with theirs in regards to preserving my privacy.
The future is always fungible. Anyone can break whatever trust they've built very quickly. But, like the post you are replying to, I have no qualms about supporting companies that are currently doing things in my interest and don't have any clear strategic incentive to violate that trust.
Edit: that same incentive structure would apply to NVIDIA, afaik
I can't agree with your comment. Apple has every incentive to monetize your data; that's the whole value of Google and Meta. And they are already heading into the ad business, earning billions last I checked. Hardware isn't selling as much as before, and that isn't going to change for the better in the foreseeable future.
The logic is exactly the same as what Meta claims: we will pseudonymize your data, so technically your specific privacy is still yours, see, nothing changed. But you still end up in various ad-targeting groups, plus we know how "good" those anonymization efforts are when money is at play and corporations exist only to earn as much as possible. The rest is PR.
I'll disagree with your disagreement - in part at least. Apple is still bigger than Meta or Google. Even if they had a strong channel to serve ads or otherwise monetize data, the return would represent pennies on the dollar.
And Apple's privacy stance is a moat against these other companies making money off of their customer base. So for the cost of pennies on the dollar, they protect their customer base and ward off competition. That's a pretty strong incentive.
Persuasive, thank you
Don't bother; the fanboys believe Apple can't do anything wrong or malicious. At this point it's closer to a religion than ever.
You would be amazed at the response of some of them when I point out shit Apple does that makes their products clearly lacking for the price; the cognitive dissonance is so strong they don't know how to react other than by lying or pretending it doesn't matter.
If you’re annoyed about quasi-religious behavior, consider that your comment has nothing quantifiable and contributed nothing to this thread other than letting us know that you don’t like Apple products for non-specific reasons. Maybe you could try to model the better behavior you want to see?
How do you even come to the conclusion that I don't like Apple products? I have a phone, watch and computer from them. It's not like I hate the products.
I have very specific reasons to be annoyed, but there are far too many to list in a single post. I was working with and buying Apple gear before the turn of the millennium, and I have worked as an Apple technician and helped more Apple users than I could possibly list. That is my experience, and your comment illustrates it precisely.
My comment was about the delusion of privacy first marketing bullshit that they came up with to excuse the limitations of some of their stuff.
But since Apple can't do anything wrong, we are not going anywhere. Whatever, keep on believing.
Your comment is literally more subjective, dismissive, and full of FUD than any other on this thread. Check yourself.
Considering you commented on another one of my comments about the Apple "special sauce magic' RAM I can see how you could think that.
I'll check myself thanks, but you should check your allegiance to a trillion dollar corp and its bullshit marketing, that's really not useful to anyone but them.
Their privacy strategy is to make you feel comfortable with their tech so you don't mind when they shop it around to the highest bidder.
Make no mistake: they're just waiting for the right MBA to walk through the door, see the sky-high value of their users, and start chop-shopping them.
Enshittification is always available to the next CEO, and it will only get more tempting as the value of the walled garden increases.
Yes, it’s possible that they’ll change in the future but that doesn’t make it inevitable. Everything you describe could have happened at any point in the last decade or two but didn’t, which suggests that it’s not “waiting for the right MBA” but an active effort to keep the abusive ones out.
One thing to remember is that they understand the value of long-term investments. They aren’t going to beat Google and Facebook at advertising and have invested billions in a different model those companies can’t easily adopt, and I’m sure someone has done the math on how expensive it would be to switch.
> [Apple's] privacy-first strategy
That is a marketing and advertising strategy.
I'm a cross-platform app developer and can assure you iOS is much stronger in terms of what data you can/can't get from a user
That says nothing about what Apple does with data.
- dozens of horrific 0-day CVEs every year because not enough is invested in security, making privacy virtually impossible
- credit card required to install free apps such as the "private" Signal messenger
- location required just to show me the weather in a static place, lol
- claims to be E2E, but Apple controls all keys and identities
- basically sells out all users' iCloud data in China, and totally doesn't do the same in the US, because Tim pinky swears
- everything is closed source
Apple is privacy last, if anything. Forgotten PRISM already?
> complete independence of network connectivity and hence minimal latency.
Does it matter that each token takes additional milliseconds on the network if the local inference isn't fast? I don't think it does.
The privacy argument makes some sense, if there's no telemetry leaking data.