777 comments
  • Bukhmanizer4d

    I’m surprised not many people talk about this, but a big reason corporations are able to do layoffs is just that they’re doing less. At my work we used to have thousands of ideas of small improvements to make things better for our users. Now we have one: AI. It’s not that we’re using AI to make all these small improvements, or even planning on it. We’re just… not doing them. And I don’t think my experience is very unique.

    • baazaa4d

      I think people need to get used to the idea that the West is just going backwards in capability. Go watch CGI in a movie theatre and it's worse than 20 years ago, go home to play video games and the new releases are all remasters of 20 year old games because no-one knows how to do anything any more. And these are industries which should be seeing the most progress, things are even worse in hard-tech at Boeing or whatever.

      Whenever people see old systems still in production (say, things that are over 30 years old) the assumption is that management refused to fund the replacement. But if you look at replacement projects, so many of them are such dismal failures that management's reluctance to engage in fixing stuff is understandable.

      From the outside, decline always looks like a choice, because the exact form the decline takes was chosen. The issue is that all the choices are bad.

      • nisa4d

        My personal theory is that this is the result of an incompetent management class where no self corrections are happening.

        In my work experience I've realized everybody fears honesty in their organization, be it big or small.

        Customers can't admit the project is failing, so it churns on. Workers/developers want to keep their job and either burn out or adapt and avoid talking about obvious deficits. Management is preoccupied with softening words and avoiding decisions because they lack knowledge of the problem or process.

        Additionally, there has been a growing pipeline of people who switch directly from university, where they've been told to only manage other people and not care about the subject, to positions of power, where they are helpless and can't admit it.

        Even in university, working for the administration, I've watched people congratulate themselves on doing design thinking seminars every other week and working on preserving their jobs instead of doing useful things, while the money for teaching assistants or technical personnel is not there.

        I've seen that so often that I think it's almost universal. The result is mediocre broken stuff where everyone pretends everything is fine. Everyone wants to manage, nobody wants to do the work or god forbid improve processes and solve real problems.

        I've got some serious ADHD symptoms, and as a sysadmin, when you fail to deliver it's pretty obvious. I messed up big time more than once, and it was always sweet-talked, excused, and bullshitted away by the higher ups.

        Something is really off and everyone is telling similar stories about broken processes.

        Feels like a collective passivity that captures everything, where nobody is willing to admit that something doesn't work. And a huge misallocation of resources.

        Not sure how it used to be, but I'm pessimistic about how this will end.

        • AnthonyMouse3d

          > My personal theory is that this is the result of an incompetent management class where no self corrections are happening.

          This is really a cultural problem that has infected management along with everyone else.

          It used to be that you were expected to be able to fix your own car or washing machine, and moreover that a machine you couldn't fix would be rejected by customers. It was expected to come with documentation and be made of modular parts you could actually obtain for less than three quarters of the price of the entire machine.

          Now everything is a black box you're expected to never open and if it breaks and the manufacturer doesn't deign to fix it you go to the store and buy another one.

          The problem with this is that it poisons the well. Paying money to make the problem go away instead of learning how to fix it yourself means that, at scale, you lose the ability to fix it yourself. The knowledge and infrastructure to choose differently decays, so that you have to pay someone else to fix the problem, even if that's not what you would have chosen.

          The result is a helplessness that stems from a lack of agency. Once the ability to do something yourself has atrophied, you can no longer even tell whether the person you're having do it for you is doing it well. Which, of course, causes them to not. And in turn to defend the opacity so they can continue to not.

          Which brings us back to management. The C suite doesn't actually know how the company works. If something bad happens, they may not even find out about it, or if they do it's through a layer of middle management that has put whatever spin on it necessary to make sure the blame falls on the designated scapegoat. Actually fixing the cause of the problem is intractable because the cause is never identified.

          But to fix that you'd need an economy with smaller companies, like a machine with modular parts and documented interfaces, instead of an opaque monolith that can't be cured because it can't be penetrated by understanding.

          • beowulfey3d

            >Paying money to make the problem go away instead of learning how to fix it yourself means that, at scale, you lose the ability to fix it yourself.

            This is very insightful and, in my mind, a good preview of what is happening with AI right now. We will forget how to use the skills that built these systems in the first place.

            • InDubioProRubio3d

              Don't worry mate, AI can teach us if we ever need it again.

              • pcthrowaway3d

                Poe's law in action. I assume this is sarcasm but you never know

            • al_borland3d

              There was a South Park special about this. I think it was Into the Panderverse as a 2nd plot line.

              The handymen of the future were like today’s tech bros. They were loaded, because people couldn’t perform basic tasks around the house. When a father was looking to teach his son how to fix the oven, he showed him how to call the handyman.

          • calvinmorrison3d

            > > My personal theory is that this is the result of an incompetent management class where no self corrections are happening.

            > This is really a cultural problem that has infected management along with everyone else.

            Because every time a natural correction happens, the government bails them out

          • InDubioProRubio3d

            The management caste was afraid, like the devil of holy water, that an idealised socialist society's machinery from beyond the Iron Curtain could produce better, longer-lasting products, superior in all aspects. The system people imagined up until the 80s was something similar to Atomic Heart, and that was what drove systemic competition. It drove quality, it drove investment, it protected the state against subversion and destructive ideologies like liberalism, because the state was the big protector against the thing with even more state. The systemic competition.

            Capitalism needs a deadly threat to be good.

          • akoboldfrying3d

            > Now everything is a black box you're expected to never open and if it breaks and the manufacturer doesn't deign to fix it you go to the store and buy another one.

            Do you own a PinePhone?

            Or do you own a higher-spec, more familiar iPhone or Android that can't be opened up?

            It's the second one, isn't it. Who made you choose it?

            • saturn86013d

              >Do you own a PinePhone?

              Yes. It's a piece of junk. Why do I own it? I like to throw my money away on ideals I never actually follow. It's sitting next to my unplayed guitar, my list of books on how to effectively get A's in college (I ended up a C+ student), and my Raspberry Pi that has only ever been powered on once.

            • BOOSTERHIDROGEN3d

              What are you trying to say? Can you be more explicit?

              • Cheer21713d

                They are trying to make the argument that is parodied in this meme: https://knowyourmeme.com/memes/we-should-improve-society-som...

              • ourhouse5093d

                I think they're trying to convey "don't throw stones at glass houses." It sounds like they're trying to make the argument that if you're not practicing what you're preaching then you shouldn't preach. Not saying I agree with the sentiment; but I think that's what they're going for.

                The reality is that you can have it both ways. I own an iPhone, I know how to build a computer, I buy software, and I know how to code. There is value in understanding how the things you have work, but that doesn't mean that you can't or shouldn't buy a high quality product just because you can't take it apart.

              • akoboldfrying3d

                The comment I responded to implies a lack of availability of repairable devices. I'm trying to make the point that the market does in fact offer them, but that consumer choice is overwhelmingly in favour of locked-down shiny things -- leading to wild proliferation of the latter. This preference is so strong that even those who complain about the lack of repairability choose locked-down-and-shiny over repairability, perhaps without even consciously realising it.

                It's tiring to read again and again about evil external forces wrecking the world, when the choices are our own, and right in front of our faces.

                • wing-_-nuts3d

                  Look, there is a large difference between certain state of the art devices that I would not have a hope of repairing, and things that can be simple to repair like major appliances, cars, etc. I'm currently doing a load of clothes in my washer and drier. They're commercial grade beasts from the 90's, and when they've broken, we've been able to fix them with cheap replacement parts.

                • Teever3d

                  But you understand that the quality of the repairable devices that exist on the market is far lower than the quality of the standard devices on the market, because more money goes into the latter's R&D due to the scale of the entrenched players, and this means that these goods aren't directly comparable.

                  I would prefer a scenario where monopolists were broken up and regulators mandated open designs that can be repaired.

            • iamtedd3d

              The PinePhone still has to be good enough to make it a sensible choice over the iPhone. Ideology can't be the only selling point.

              Compare the market success of the PinePhone to the Framework laptop. Their laptops are technically competitive with the Dells and the HPs of the world, while also being repairable.

              The PinePhone doesn't even beat the until-recently-current iPhone SE in performance. It's a terrible choice, technically speaking.

            • Spivak3d

              The fact that the best phone is a black box for the financial benefit of the corporation that made it isn't exactly the point you think it is.

              • neuralRiot3d

                The thing is that most of the public demand 2025 technological “marvels” with the accessibility of the 80’s. For a device like a phone to fit in your pocket and have a battery that can make it run the whole day and beyond, some compromises need to be made. https://cdn.pressebox.de/a/48cf30b132272045/attachments/0663...

                That is what a 0201 capacitor looks like.

            • AnthonyMouse3d

              > Do you own a PinePhone?

              So this is why it's a cultural issue.

              Let's consider a market that still works basically like it's supposed to: Desktop PCs. You have your ATX standard PC, it came with a Core i3 processor which is getting a little long in the tooth, but you can drop in a Core i7 and double the number of cores. Not only that, the parts are all modular and standard. You take your ten year old i3 6100 dual core, swap out the motherboard and CPU and now it's a 16-core Ryzen 9 5900XT from 2024, but it still supports the same memory, GPU, SSD, chassis, power supply, etc., any of which you could also have independently replaced before or after this.

              So now I go and buy a PinePhone, and after a couple years the CPU seems a little anemic. No problem, it's modular, I'll just buy one of those fancy chips they put in the iPhones and put that in there. Or at least the top end things from Samsung or Qualcomm. No? That's not available?

              Okay, but at least I can put whatever software I want on it. Now the way this works is, people can improve their own devices in collaboration with other people. Adding a new subsystem to your phone would be a full time job, but it could also be a dozen part time jobs. Somebody does a barebones implementation and throws it on github, then you personally only need it to do one extra thing and all you have to do is add the extra thing instead of starting from scratch, which is a tractable problem instead of a hopeless pipe dream. But when each person contributes a little part, you ultimately end up with a complete implementation. Most of the users don't even have to contribute anything, as long as there is a large enough community of people who do.

              Except that 99% of people have locked down devices, so the community is suppressed and then even if you buy the device that allows you to do it, you're the only one working on that subsystem and it's too much work for you to do yourself, so you don't even make the attempt. And then what good is the device?

              It's an ecosystem problem. A cultural issue. It can't be just you. You need the default attitude of the common customer to be "this despotism will not stand" and to give the finger to any company that locks you out of your own property. Regardless of whether you personally actually upgrade your own device or write your own code, you need everyone to have the ability to do it, because the alternative is a friction that erodes the community and in turn destroys a backstop against involuntary captivity.

        • somenameforme4d

          I think a way to sum this up is simply metric optimizing. As organizations and companies grow larger, evaluating people at scale becomes necessary. And so metrics are used, and people then naturally start to optimize around those metrics. But it seems to invariably turn out that any sort of metric you create will not effectively measure progress towards the goal you want to achieve once that metric ends up being optimized for.

          The traditional term for this is cobra effect. [1] When the Brits were occupying India they wanted to reduce the cobra population, so they simply created a bounty on cobra heads. Sounds reasonable, but you need to have foresight to think about what comes next. This now created a major incentive for entrepreneurial Indians to start mass breeding cobras to then turn in their heads. After this was discovered, the bounty program was canceled, and the now surging cobra farm industry mostly just let their cobras go wild.

          I think the fundamental problem is that things just don't work so well at scale, after a point. This is made even worse by the fact that things work really well at scale before they start to break down. So we need a large economy that remains relatively decentralized. But that's not so easy, because the easiest way to make more money is to just start assimilating other companies/competitors with your excess revenue. Anti-trust is the knee jerk answer but even there, are we even going to pretend there's a single person alive who e.g. Google (or any other mega corp) doesn't have the resources to 'sway'?

          [1] - https://en.wikipedia.org/wiki/Perverse_incentive

          • Seattle35034d

            I think there is something to the idea that there are too many too-large firms doing abstract work. People have become detached from their impact.

            • a_bonobo3d

              https://en.wikipedia.org/wiki/Marx%27s_theory_of_alienation

              >Karl Marx's theory of alienation describes the separation and estrangement of people from their work, their wider world, their human nature, and their selves. Alienation is a consequence of the division of labour in a capitalist society, wherein a human being's life is lived as a mechanistic part of a social class.[1]

              • skyyler3d

                One of my favourite things about internet forums is watching people re-invent Marxist theory by being mad at the current state of things.

                It's been like this for a while now.

                I think I even saw someone in a conservative subreddit suggest that everyone should work on a farm for a few years after college before they get real jobs. I'm still unable to determine if this was a troll or if a well-meaning conservative actually reinvented Mao's Down to the Countryside movement.

                • silverquiet3d

                  My favorite was some comment on Reddit or other observing how often people had to resort to paying medical bills via GoFundMe. They had the idea to create one large pool of money in order to pay the medical bills of all citizens. It is often hard to tell trolling from genuine incompetence.

                • somenameforme3d

                  Marx doesn't make any real sense in modern times because of his obsession with class divides. In contemporary society there's no real difference between a capitalist and a worker. This is true even in his own terms since we all literally own one of the most valuable 'means of production' - a computer. Obviously I'm in no way saying that there aren't invisible classes in society, but that these don't define our possibilities in ways at all comparable to the early 19th century.

                  People also seem to try to shoehorn him into every topic, even when it really doesn't fit. For instance this issue is not one about some group of melancholy workers being alienated from the product, but 'capitalists' who have become so detached from their product that they are left looking at things through a sort of compression lens that leaves them with a deeply distorted view of reality. Even with your example - I agree that learning 'life skills' is extremely important for a solid development, but Mao wasn't doing that - he was effectively exiling people to rural areas, largely to replenish populations after massive famines that were created by his other harebrained schemes.

                  • skyyler2d

                    I want to focus on something you said here: >In contemporary society there's no real difference between a capitalist and a worker.

                    The difference is access to capital. Just like it was 150 years ago. Workers don't have enough holdings to sustain themselves without selling their body. Capitalists have enough holdings to not have to sell their body and can instead put their money to work through various means like entrepreneurship.

                    Also, I didn't even bring up the Down to the Countryside program as a good aspect of Mao... But since you brought it up, I figured I'd mention that his "harebrained schemes" doubled the life expectancy in China rather quickly. Like all world leaders I've studied, he did great things, and he did horrible things.

                    • somenameforme2d

                      I wrote a lengthier post, but in writing it I realized there's a simple way to cut to the heart of this issue. Many workers in various fields (tech, legal, medicine, and more) now tend to make substantially more money than many business owners, and often for far less hours worked. In this world how does the notion of capitalist vs worker make any sense? Let alone with the stereotypes Marx depended upon for his arguments?

                      • skyyler2d

                        >In this world how does the notion of capitalist vs worker make any sense?

                        Well paid workers can amass the means to become capitalists.

                        >Let alone with the stereotypes Marx depended upon for his arguments?

                        Marx called these types of people that make enough money to own their own means of production "petit bourgeoisie". This is in contrast to the "haute bourgeoisie".

                        This isn't some exception to Marxist thought; this is literally one of the core components of Marxist thought.

                        • somenameforme2d

                          Wiki tends to be obsessively fond of Marxist stuff, and gives a very different definition for petit bourgeoisie:

                          ---

                          "Karl Marx and other Marxist theorists used the term petite bourgeoisie to academically identify the socio-economic stratum of the bourgeoisie that consists of small shopkeepers and self-employed artisans.

                          The petite bourgeoisie is economically distinct from the proletariat social-class strata who rely entirely on the sale of their labour-power for survival. It is also distinct from the capitalist class haute bourgeoisie, defined by owning the means of production and thus deriving most of their wealth from buying the labour-power of the proletariat..."

                          ---

                          The critical distinction being that they aren't 'selling their labor-power' to others.

                          And I just don't see how one can claim this makes any sense in modern times! Proles selling their 'labor power' are out-earning the bougies, anybody (even relatively low wage workers) can hire the 'labor-power of the proletariat' with things like Fiverr (amongst many others). And basically everybody owns the most valuable means of production in modern society - a computer. If you don't, you can buy one with a day or so of minimum wage work.

                          For that matter, bougies in modern times don't make their wealth from buying labor power - they mostly just dump money into investments, bonds, and other such financial vessels. Bonds right now are at near 5%! And again the distinctions really fail because the same is also true of retail investors with a Robinhood or whatever.

                          [1] - https://en.wikipedia.org/wiki/Petite_bourgeoisie

                          • dragonwriter2d

                            > And I just don't see how one can claim this makes any sense in modern times! Proles selling their 'labor power' are out-earning the bougies

                            No, they generally are not. There is obviously overlap, as there was in Marx's time, in income, but that’s not a problem with the theory—class isn’t about income but mode of participation in the economy.

                            > For that matter, bougies in modern times don't make their wealth from buying labor power - they mostly just dump money into investments, bonds, and other such financial vessels.

                            The “financial vessels” are instruments of other entities, most of which exist by rented labor power.

                            > And again the distinctions really fail because the same is also true of retail investors with a Robinhood or whatever.

                            The distinctions have never been hard lines. In the most simplistic analysis, class is determined by the predominant mode of interaction with the economy, while a more nuanced view sees class membership as essentially a fuzzy membership function, depending on the degree to which one interacts in the manner (selling labor to capitalists vs. applying your own labor to your own capital vs. owning capital to which rented labor is applied) archetypical of a given class (both these modes of analysis have been around for quite a while, though the fuzzy membership function language would only be used fairly recently).

                            • somenameforme2d

                              > "that’s not a problem with the theory—class isn’t about income but mode of participation in the economy."

                              We can challenge this assertion by reductio ad absurdum. Imagine somehow all bougies earned less than all workers. Everything Marx said would be absolutely and completely nonsensical. There's nothing inherently impossible about such a world existing and it makes clear the point that income levels do absolutely matter. And in Marx's time I think it is fairly safe to say there would have been exactly 0 proles earning more than bougies. The concept of a 'factory' worker earning more than a factory owner would have been entirely alien to him, and most of the world, until fairly recently.

                              The most paradoxical thing about all of this is that the people most drawn to Marxist stuff are disproportionately in tech, the exact sort who, in many cases, already earn more than many, and likely most, business owners, work far fewer hours, and generally have dramatically nicer working conditions. I think it's mostly misidentified discontent. It's not the economic system that's at fault, but somehow building things in the digital world is fundamentally unsatisfying and unfulfilling, even if you get drowned in money, massages, bean bag chairs, and ping pong tables.

                              If people want fulfilling lives (so far as work is concerned), don't work in ad-tech. If you want stupid amounts of money, work in ad-tech. You get the stupid amounts of money precisely because the work is awful and empty. It's a rather dramatically different world from Marx's time where, in general, work was awful and compensation was awful.

                              • dragonwriter2d

                                > Imagine somehow all bougies earned less than all workers. Everything Marx said would be absolutely and completely nonsensical.

                                I mean, it wouldn't, if they still exercised power. But they don't earn less: while there is overlap at the boundaries, the classes defined by modes of interaction do, across every capitalist economy (including modern mixed economies, which are not the same system as the capitalism that Marx named and addressed, but share important features with it), form an aggregate hierarchy of both power and income in the same order as the hierarchy of power in which Marx describes them, even though the ranges of individual incomes overlap.

                                > And in Marx's time I think it is fairly safe to say there would have been exactly 0 proles earning more than bougies.

                                No, definitely the most well-paid person living by renting out their labor would have had a higher income than the least successful owner of capital to which rented labor was applied. Capitalists (then no less than now) are capable of losing money continuously, eventually reaching the point where they fall out of the bourgeoisie entirely, and even among those that are more fortunate than that, there would have been many who were technically haut bourgeois because they relied primarily on renting others' labor to apply to their capital, and many more who were petit bourgeois, applying their own labor to their own capital--like homesteaders with small holdings--who would earn less than the most successful hired experts.

                                > It's a rather dramatically different world from Marx's time where, in general, work was awful and compensation was awful.

                                Yes, in modern mixed economies the condition of the median worker is better than in the capitalism of Marx's time, but, in general, work is awful and compensation is awful. Sure, the small percentage of the workers in well-compensated positions like the ad-tech you point to may do amazingly well -- but that's a minute fraction of workers.

                                • somenameforme2d

                                  I just looked up the exact stats and it turns out my hypothetical world isn't hypothetical. Currently the average "small business" owner takes home less than $70k a year. [1] Small business in quotes because that term has been distorted so politicians can give handouts to big business and claim they're supporting small business. 99.9% of all businesses in the US are classified as "small business" which includes companies with hundreds of employees and revenue in the tens of millions of dollars, so the "average" there is misleadingly high.

                                  Factor in the fact that a business owner is going to be working far more hours on average than a 'worker', and it turns out that we do live in this apparently not-so-hypothetical world where proles make more than bougies, if we just define classes by their 'modes of economic interaction'! We can argue/nitpick the specifics in Marx's time, but I don't think you can claim in good faith that the situation was even remotely like this, and his logic was largely based on the conditions that he lived in. Even the most fundamental concepts like means of production are obsolete, because in modern times everybody owns the most valuable (by a very wide margin) means of production.

                                  And the pleasure or pain of labor is always relative to itself. For most people there's about a million things they'd rather be doing than working (including for business owners), but everybody has to put food on the plate and in modern times that's so much more pleasant an endeavor that it can't really be overstated, and this applies even to relatively recent times. When I, and I assume you, were growing up don't you remember getting endlessly spammed on TV with the non-stop 'Hurt on the job? Call Mr. Ambulance Chaser at 123-4567 today, and get what you deserve!'

                                  [1] - https://altline.sobanco.com/small-business-revenue-statistic...

                • a_bonobo3d

                  While that may be classic Marxist stuff, modern philosophers like Byung-Chul Han give a great twist on the whole thing in a digital age. I should've linked to his works too, especially for the self-optimising HN crowd.

                  Just quoting from Wikipedia:

                  >Han argues that subjects become self-exploiters: "Today, everyone is an auto-exploiting labourer in his or her own enterprise. People are now master and slave in one. Even class struggle has transformed into an inner struggle against oneself."[12] The individual has become what Han calls "the achievement-subject"; the individual does not believe they are subjugated "subjects" but rather "projects: Always refashioning and reinventing ourselves" which "amounts to a form of compulsion and constraint—indeed, to a "more efficient kind of subjectivation and subjugation." As a project deeming itself free of external and alien limitations, the "I" subjugates itself to internal limitations and self-constraints, which are taking the form of compulsive achievement and optimization.[13]

                • Seattle350316h

                  I'm not sure why you think I was unaware of Marx.

          • whearyou3d

            Agreed. Don’t know how we balance decentralization with the planetary civilization/economy’s complexity, which requires some kinds of centralization.

            That’s doubly difficult because the complexity is what lets the system produce so much output, and if you produced less, people would experience that as having less and would riot. The only way out would be if the whole society consumes less, including, visibly, the elites. Feeling taken advantage of is a far more powerful force on the non-elites compared to, up to a point, their material ups and downs.

            Trump’s ability to create a widely accepted narrative focused specifically on elites who are opposed to his power, but also who are doing extra well relative to the non elites, is what let him harness the raw force of wage stagnation et al for political power

          • puzzlingcaptcha4d

            Perhaps corporations should be organized like terrorist cells then.

        • bsenftner4d

          This crisis, which it is, is caused by the unrecognized necessity of effective communication within science and technology and business, which is not taught. Not really; only a lite "presentation skill" is taught.

          Fact of the matter: communications is everything for humans, including dealing with one's own self. Communications are how our internal self-conversation, mired in bias, encourages or discourages behavior; communications are how peers lead, mislead, inform, misinform, and omit key information - including that critical problem information that people are too often afraid to relate.

          An effective communicator can talk to anyone, regardless of stature, and convey understanding. If the information is damningly negative, the effective communicator is thanked for their insight and not punished nor ignored.

          Effective communications is everything in our complex society, and this critical skill is simply ignored.

          • alabastervlog3d

            My dad worked his way up to middle management in a large railroad.

            Management and executives had almost all worked their way up the ladder. Toward the end I think some of the higher-up ones were encouraged to get an MBA as they advanced, but they didn't do much hiring of MBAs.

            The company got bought by another in IIRC the late 90s, and this other one had already been taken over by the "professional managerial class", and they quickly replaced most of the folks from the top down to the layer just above him with their own sort.

            His description of what followed was incredible amounts of waste. Not just constant meetings that should have been emails (though, LOTS of that) but entire business trips that could have been emails. Lots of them fucking things up because they had no idea how anything worked, but wouldn't listen to people who did know. Just, constant.

            The next step was they "encouraged" his layer to retire early, for any who were old enough, which was lots of them since, again, most of them had worked their way up the ladder to get where they were, not stepped straight into management as a 25-year-old with no clue how actual work gets done. I haven't asked, but I assume they replaced them with a bunch of young business school grads.

            There are sometimes posts on HN suggesting that our dislike of business school sorts is silly or overblown, but if anything I think it's too weak. The takeover by them and, relatedly, the finance folks has been disastrous for actual productivity and innovation. Companies should be run by people who've done the work that the company does, and not just for an internship or something.

            • EFreethought3d

              There are a lot of companies out there (HP, Intel, Boeing, GM, Xerox) where if you dive into the history, at some point somebody says something to the effect of: "This used to be a great engineering firm until the finance guys took over."

            • hylaride3d

              > The takeover by them and, relatedly, the finance folks has been disastrous for actual productivity and innovation.

              The fact that so many companies play tricks with CAPEX and OPEX shows they completely miss the point that almost all corporate spending should be seen as investment, or spending to support investment, at some level.

              The past 50 years of business school has taught people that outsourcing your core competency is a good idea because it gets things "off the books" and makes quarterly reports look better. The end result was shifting huge swaths of our economy to a hostile country.

              Here in tech, I've literally seen companies shift stuff into the cloud even though it's more expensive, because OPEX can be written off right away and they don't want CAPEX on the books, only to want to shift back a year later because they decided it's now better to optimize for actual cash flow. It's infuriating.
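
              To make the quarterly-optics point concrete, here's a rough sketch (all numbers invented, and it assumes straight-line depreciation over five years, which is just an illustrative choice) of how the same cash outlay gets reported very differently as OPEX versus CAPEX:

                // Toy illustration (hypothetical numbers): same cash spent, different reported expense.
                const spend = 300_000;        // cash out the door in year one, either way
                const usefulLifeYears = 5;    // assumed straight-line depreciation period for hardware

                // Cloud bill: pure OPEX, so the whole amount hits this year's income statement.
                const cloudYear1Expense = spend;

                // Purchased servers: capitalized, so only one year of depreciation hits this year.
                const onPremYear1Expense = spend / usefulLifeYears;

                console.log(`cloud (OPEX) year-1 expense:    ${cloudYear1Expense}`);   // 300000
                console.log(`on-prem (CAPEX) year-1 expense: ${onPremYear1Expense}`);  // 60000
                // Identical cash out the door; only the reported timing differs, which is the whole "tricks" point.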

          • stuartjohnson123d

            I read Moral Mazes recently, and what it describes is not a lack of communication skill. To the contrary, the incentives created by managerial social hierarchies place a very high premium on difficult communication skills, such as the ability to fluidly support contradictory positions on different issues, the ability to manipulate symbols and euphemism to justify necessary actions, and the understanding of what makes others in their management circles feel good.

            • bsenftner3d

              What you're describing is the opposite end of the spectrum, those that do understand communications and language to the degree they can appear to fluidly support contradictory positions, but they are in fact operating at a higher communications level and running circles around those less adept in communications. They are masterful language and perception manipulators, in a strategic game of corporate dominance.

              • mlyle3d

                But, there's a hyperparameter here; we culturally and organizationally get to choose how much of this game exists and how effective it is.

                And certainly some of these games are useful; abilities of this kind are highly correlated with other abilities, and having masterful language and perception manipulators act for the interest of your company or nation is valuable.

                But it's not the only useful skill at the upper tier of organizations, and emphasizing it over all else is costly. So are internal political games-- when your organization plays too many of them, the benefits one gets from selecting these people and efforts are dwarfed by the infighting and wasted effort. It can also result in severe misalignment between individual and organizational incentives.

                • bsenftner3d

                  There is a misunderstanding that being an effective communicator equals political gaming of situations. That is possible with or without effective communications, and largely misses the point that effective communications is not playing games, it's avoiding them. It is not trying "to win", it is seeking shared understanding and consensus. If one's management is playing political games, they are failing in their communications, trying to win in some personal game, not for the betterment of the company.

                  • mlyle3d

                    > If one's management is playing political games, they are failing in their communications, trying to win in some personal game

                    Is this not A) ubiquitous, B) rich with incentives, and C) downright implied in "They are masterful language and perception manipulators, in a strategic game of corporate dominance." and "the understanding of what makes others in their management circles feel good."?

                    • bsenftner3d

                      This is the very difficult part: people adept at manipulation tend to be highly intelligent. Simply spending time with a good manipulator is dangerous. The only good metric I know here is the old saying "the key purpose of an education is to be able to recognize one in others." Good communicators also sort out weasels via their lack of distinct language and similar tells.

          • drumdance2d

            For a while I thought about starting a nonprofit or foundation or whatever whose goal was to get universities to adopt the co-op model for all majors, not just engineering. The idea was to take a year and learn how the world actually works. We're talking literally basic business skills like how to run a meeting, how to do an effective presentation, etiquette for email and Slack, etc. Also give exposure to different types of work (office vs frontline vs outdoors etc) and industry types. 1/3 business, 1/3 nonprofit, 1/3 government.

        • baazaa4d

          While I suspect the root cause is managerial dysfunction ultimately the disease spreads everywhere. I've stopped honing my technical skills because I don't expect to ever work in an organisation sufficiently well-managed for it to matter. So then you end up with the loss of genuine technical expertise from generation to generation as well.

        • idra4d

          Sounds like hypernormalisation has now hit the West

          https://en.wikipedia.org/wiki/HyperNormalisation

          • no_wizard4d

            Holy moly. This led me down a quick read to this gem [0]:

            > Aladdin (Asset, Liability and Debt and Derivative Investment Network)[1] is an electronic system built by BlackRock Solutions, the risk management division of the largest investment management corporation, BlackRock, Inc. In 2013, it handled about $11 trillion in assets (including BlackRock's $4.1 trillion assets), which was about 7% of the world's financial assets, and kept track of about 30,000 investment portfolios.

            For any one firm to have this much direct and/or indirect sway over the world’s financial assets is ripe for problems of all sorts.

            Seems rather indicative of the general consolidation of power and decline of social equality across the west

            [0]: https://en.m.wikipedia.org/wiki/Aladdin_(BlackRock)

        • __oh_es3d

          I would caveat that it's not affordable to be passionate anymore. The top engineers (mech, chem, civil, etc.) I know work in finance or consulting instead of doing things they care about.

          Closer to tech, I feel we have had a big influx of non-tech people joining the tech workforce, and the quality has suffered as a result of a lack of fundamentals and passion

          • no_wizard3d

            >Closer to tech, I feel we have had a big influx of non-tech people joining the tech workforce, and the quality has suffered as a result of a lack of fundamentals and passion

            In the web development community there is a near-linear correlation between the number of “influencers” who sell courses that prey on this influx to make money and the influx of such folks.

            I miss the days when developers generally had a passion for this work vs seeing only a big paycheck, though without artificial barriers we should have expected a large influx of people given how well it generally paid for a long time

        • fijiaarone4d

          The cause of an incompetent management class is a subservient worker class. Now, a subservient worker class means either that the workers are incompetent or that they don't have access to capital, meaning they can't strike out on their own and leave management to suffer the consequences of its incompetence.

          • Buttons8401d

            Disconnecting most benefits from employment would be a good start, especially healthcare. Imagine if small companies could focus on their product and customers instead of on being the entire social safety net for their employees.

        • whatever14d

          This started when companies decided that labor is fungible.

          The moment you admit failure as an employee, you are out of the company. And no, for most people it is not easy to find a job that will not disrupt their lives (moving cities, changing financial planning, even health insurance).

          So employees do what they have to do. They will lie till the last moment and pretend that the initiatives they are working on are huge value add for the company.

          In the past you knew you would retire from your company, and the compensation differential was not that huge across levels, so there was little incentive to BS.

          Today everything is optimized with a horizon of a financial quarter. Then a pandemic hits, and we realize that we don't even know how to make freaking masks and don’t even have supplies of things for more than a week.

          • hansmayer4d

            Great points. I was also shocked to see on an example recently that even basic computer literacy is gone. We visited a couple of friends of ours recently. And as things go, at some point they (non-tech folks) asked me to help them with some printer settings on their new laptop - I am sure we have all experienced this many times over. So they pass me the notebook, not connected to a power source. I noticed the battery was low and asked for the adapter to connect it. I proceeded to tell them it was not a good idea to let the notebook battery go so low and that they should operate it on battery only when they don't really have access to a power supply. The response: they had thought the way you use a notebook was analogous to a mobile phone, i.e. you charge it up, then use it all the way until the battery drops low, then rinse and repeat. The smartphones have ruined our society in more ways than one.

            • nyarlathotep_3d

              > was also shocked to see on an example recently that even basic computer literacy is gone.

              Even with people who work with/in software roles there are often shocking knowledge gaps in areas that they work in. I've worked with more than one front-end "engineer" who only understood React--they had no conception of DOM APIs or that React was an abstraction on top of that whole underlying environment.

              Even creating a static page with a simple form was create-react-app for them.
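
              For anyone wondering what the alternative even looks like, here's a minimal sketch (the field names are made up) of a static page's form wired up with nothing but the DOM APIs React sits on top of, no framework or build step required:

                // Build and wire up a basic form using only standard DOM APIs.
                const form = document.createElement("form");

                const email = document.createElement("input");
                email.type = "email";
                email.name = "email";
                email.placeholder = "you@example.com";

                const submit = document.createElement("button");
                submit.type = "submit";
                submit.textContent = "Subscribe";

                form.append(email, submit);
                form.addEventListener("submit", (event) => {
                  event.preventDefault();                  // stay on the page
                  console.log("submitted:", email.value);  // handle the value however you like
                });

                document.body.append(form);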

              • hansmayer3d

                I feel your pain; the quotation marks are spot on. It does not help that they are usually former political science or media graduates who decided they will make big bucks in "tech". Very hard to work with those people, just because their entire background is so damn orthogonal to a classic engineering background.

                • whatever13d

                  Which company is willing to develop employees in deep technology for the long run? All of the frameworks were built with the explicit goal of abstracting away the engineering part and ensuring they are easy enough for someone with bootcamp experience to start contributing. Aka chew employees up till they burn out and spit them out. Rinse and repeat.

                  From an employee perspective, let's say I am a computer scientist: why should I spend precious time developing myself in the fundamentals of the web when my manager just wants me to pump out React and Express.js code 24/7?

                  And for my promo? Well I will just point out that the system became slow and unmaintainable, propose adopting a new set of frameworks, cash the checks and move on to other pastures.

                  All the incentives are wrong.

                  • hansmayer3d

                    Well, that's the problem - too many people motivated only by the paycheck/career. It used to be different: people without a deep technical background were largely doing things they were more competent in, and the software, for all its troubles, without idealising the past, was a few notches higher in quality than today. Myself and a lot of people I know became engineers because we liked working with the machines. Not because someone offered us a lot of money; that came as a consequence. I couldn't imagine, for example, retraining myself to become a lawyer even if I had a guaranteed 2x the income I have now. It must be horrible for people who force themselves like that. More than once I've heard frontend "engineers" complaining bitterly about the supposed 'unpredictability' of computers, whenever they accidentally switched off some environment variable or something of that sort. Just do what you enjoy; the money will follow.

            • bluescrn3d

              This was inevitable, now that people are growing up with touchscreens and app stores. 'Content consumption devices' rather than proper computers. And so much digital content competing for their attention.

            • corimaith3d

              To be fair, charging your notebook at 100% all day is going to degrade the battery pretty quickly. Using it unplugged until it's low is actually the correct procedure.

              • teeray3d

                I think most charging controllers have decided that holding charge around 70% while plugged in is best.

                • resize29963d

                  It is true that ~70% is a good idea, but most charging controllers are designed to give a full charge because they have no way of knowing when the user wants a 100% battery because they're about to go out or 70% because they're going to be plugged in for a while.

                  This can be changed in software; setting it to 70-80% or having a toggle is best for the battery.

              • avidiax3d

                If you were storing a laptop, it would best be at 50% or so. The battery is under less stress at that state of charge (SoC), so the battery will age more slowly.

                If you have OSX, you can use Al Dente[1] to limit SoC to 70 or 80% while using it to reduce battery aging. There may be similar settings on Windows depending on your laptop's manufacturer.

                If you can maintain a limited SoC rather than running the battery down, that's most preferable.

                Otherwise, discharging lightly (but not below 20% or so) then charging to 80% or so would be a good usage pattern.

                It's helpful to know that many chargers are designed to achieve 1C charge rate (this excludes "fast chargers"). That essentially means they go from 0 to 100% SoC in one hour. So start a 30 minute timer when you plug in electronics to charge, and you'll gain about 50% SoC.

                [1] https://github.com/AppHouseKitchen/AlDente-Charge-Limiter
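
                To put rough numbers on that last rule of thumb, here's a back-of-the-envelope sketch (it assumes a constant 1C rate, which real chargers only approximate, since charging tapers off as the cell nears full):

                  // Estimate state of charge (SoC) after plugging in, assuming a constant 1C charge rate.
                  function socAfterCharging(startSocPercent: number, minutesPluggedIn: number, cRate = 1): number {
                    const gained = (minutesPluggedIn / 60) * cRate * 100; // 1C is roughly 100 percentage points per hour
                    return Math.min(100, startSocPercent + gained);
                  }

                  console.log(socAfterCharging(30, 30)); // 80 -- a 30-minute top-up adds roughly 50 points
                  console.log(socAfterCharging(60, 15)); // 85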

              • hansmayer3d

                Well, no. It seems to be something that spilled over from smartphone usage patterns. For notebooks which are plugged in, the notebook is supplied from mains power and the battery gets charged only if necessary, by applying intelligent logic. For example, my Legion notebook only charges the battery when it's below a certain threshold. I think this is by now the case even with the most low-end notebooks. Plus, the non-linear nature of consumption on a developer notebook makes the battery unreliable as a power supply for serious work. Try running a few database containers in your local environment while sitting in a one-hour conference pair-programming with video on, and tell me how far you get with that 100% charged battery ;)

            • zahlman3d

              Your point is well taken, but I wouldn't call your anecdote a matter of "basic computer literacy". I've been using desktop computers regularly since the Apple ][ era, but I've never owned a laptop or had to worry about charging one.

        • zosima3d

          This is it. There is a mass-hypnosis in the west where reality at best is being completely ignored and at worst actively treated in a very hostile manner.

          • mattgreenrocks3d

            The full embrace of anti-reality is now quite obvious, with social media's perceived relevance being a symptom of that. Many can't and don't want to cope with reality because it doesn't bend to their will.

        • roenxi4d

          > Something is really off and everyone is telling similar stories about broken processes.

          There are people out there who are pretty conflict-avoidant by nature, and any group tends toward pretty significant levels of cohesion because of it. There are some classic stories out there about when it goes particularly bad and spirals into a bad case of groupthink.

          In the economy there are supposed to be some slightly cruel feedback mechanisms where companies (effectively big groups) that get off track are defunded and their resources reallocated to someone more competent. The west has been on a campaign to disable all those feedback mechanisms and let companies just keep trudging on. We've pretty much disabled recessions by this point. A bunch of known-incompetent management teams have been bailed out so they can just keep plodding along destroying value. There is not so much advantage in being honest about competence in this environment; if anything, it is a bad thing because it makes it harder to take bailout money with a straight face.

          I cite the Silicon Valley Bank collapse as an interesting case study. A looot of companies should have gone bust with that one because they were imprudent with their money. They didn't.

          • jimnotgym4d

            I think one issue that exacerbates this is concentration of wealth. This has created such a demand for financial assets that their price is ridiculous, no matter how bad the management of those companies is.

            • AnthonyMouse3d

              It's basically the other way around. The companies that are enabled to grow without bound are the cause of concentration of wealth, not the result of it. Who are the billionaires? The early shareholders in megacorps.

          • bix63d

            Companies should go bankrupt because a trusted bank had a bank run thrust on it by competitors? I don’t agree with that.

            • roenxi3d

              You can agree with whatever you like. But if your stance is people should be able to just give money to whoever and it all works out in the end then you aren't supporting an environment where management are honest, because they are being supported in being wilfully blind.

              They're managing capital. If they get bailed out because they turned out to be completely irresponsible in managing their capital then nobody can claim to be surprised that management tend not to be of the highest standard on any axis.

              What is supposed to be the incentive here for appointing competent managers for most companies? It literally doesn't matter. Even company-bankrupting performance will turn a profit once the effects of money printing are factored in.

              • bix69h

                Managing capital is a vital part of any business but a small team of 5 does not have the same resources or requirements as a team of 500 or 5000.

                SVB has been a vital supporter of startups for decades. Why would a resource constrained startup spend time worried about it? Money goes in and out the bank, great, that’s all most startups should need to worry about.

            • zahlman3d

              I don't think that's a fair characterization of what happened. See e.g. https://www.youtube.com/watch?v=WowVQ4rhbt8 .

              • bix63d

                My comment is somewhat tongue in cheek. I understand the underlying issue. The back channeling during that time was crazy though.

        • choeger4d

          I can confirm nearly everything you say, and I'd like to add that it's a cultural phenomenon. We don't seem to value competence anymore. I cannot recall when I've heard someone say something positive about another person's competence. Be it a craftsman, industry worker, knowledge worker, or even a teacher. There doesn't seem to be any value in doing a good job.

          • MaKey3d

            I do it. This week I congratulated a dentist on his great job at a gum transplant. I love it when people are highly competent at their profession.

            • bluGill3d

              Is your dentist competent, or good at making you think he is? You shouldn't see your dentist enough to know.

              • MaKey1d

                He's a periodontist, I've only seen him for the two gum transplants I had. It's a delicate procedure and the results were great both times, so to me he is competent.

        • lenerdenator3d

          > My personal theory is that this is the result of an incompetent management class where no self corrections are happening.

          Close. They're not incompetent; we just redefined competence.

          It used to be that competence was a mix of a lot of distinct, but interdependent, qualities. The end result was synergy that allowed for people and organizations (including companies) to compete and move society forward.

          In the 1970s, we started to allow a bunch of psychopaths (I'm saying this in the clinical sense) to redefine competence. Instead of this array of distinct qualities, they just defined it in terms of ability to create monetary value, particularly if that value was then transferred to shareholders. That was it.

          We also switched to quarterly reporting for for-profit companies, shrinking the window to evaluate this new definition of competence to 90 days. Three months.

          An end result of this was that you could simply do whatever made the most money in 90 days and be considered competent.

          Jack Welch was the paragon of this. GE shareholders saw massive gains during the latter half of his tenure at the helm. This wasn't because of groundbreaking new products or services; quite the opposite: Jack realized that selling off divisions and cutting costs by any means necessary was a good way to make money in the 90 day period. Institutional knowledge and good business relationships in the market - two of the elements of the former definition of competence - were lost, while money - the sole element under which competence was judged in the new definition - went up.

          You also had managers doing a lot of the avoidance of real management, like you speak of. Instead of betting on a new product or trying to enter a new market, they took a Six Sigma course, learned a bunch of jargon, and cut costs at the expense of business past the 90 day period.

          If you do this enough (and we did, far beyond just GE), that expense is taken at the societal level. Existence extends beyond 90 days. You can't mortgage the future forever. It's now the future, the payment is due, and we have an empty account to draw from.

          Theoretically, we could go back to a more in-depth evaluation of competence and reward its display over the long term. In practice, there are a bunch of people who got unfathomably wealthy off of the shift to the "new" competence, and now they're in charge and don't want to switch back, so we won't.

          • gen2203d

            In the Haudenosaunee system of governance, whenever they needed to make a consequential decision, the family-clan-appointed leaders would nominate some sub-group of the circle to represent the interests of the unborn 7 generations in the future. That's far enough into the future, ~100+ years, that the youngest person alive today to experience the decision would certainly be deceased before that generation is born.

            On a long enough time scale, short-term oriented systems naturally-select themselves out of existence. The U.S. Constitution didn't survive 7 generations. The Civil War was in 1865 (77 years, ~4 generations). Reconstruction Era made it maybe 60 years (3 generations), as far as the Great Depression / Dust Bowl.

            The current post-war ordering of the interest of short-term capital above all else doesn't have a well-defined start date, but 1968 (MLK Jr, RFK, nomination of Humphrey) is a solid one. We're hardly 3 generations in, and it doesn't feel great.

            Really, when you look at American history, the periods endowed with bouts of long-term thinking are really quite rare (1770-1810, 1880s-1900s, 1930s-1950s). Maybe we're due for another one.

            • ViktorRay3d

              > The U.S. Constitution didn't survive 7 generations. The Civil War was in 1865 (77 years, ~4 generations).

              I know the 13th, 14th, and 15th amendments to the US Constitution are often considered America's Second Founding because they legally eliminated [1] all the elements of racism within the United States Constitution, but saying that the Constitution "didn't survive" doesn't seem accurate...

              [1] That being said we all know that it took many many decades after those 3 amendments for the laws in the United States to accurately reflect the principles embodied within these amendments.

              • gen2203d

                The systems of governance pre-ACW and post-ACW were two distinct systems. The pre-ACW was essentially two competing systems of power duct-taped together with the 3/5ths compromise. The post-ACW was one dominant system of power that had beaten another into submission and annexation.

                The 3/5ths compromise, and its implicit enshrinement of slavery as an American institution, is an example of short-term thinking (compromising on the legal definition of a human being in order to get the constitution ratified) that eventually caused the greater system to unravel. Hundreds of thousands of people died in the civil war, millions of people experienced slavery. It could have been avoided if longer-term thinking had prevailed.

                I hear you that the Constitution (inclusive of its self-mutating property) survived as a useful document of federal governance. This purported maintenance of a federal union was a huge legitimizer of northern domination of the post-ACW United States. But, I think you'd agree that the "system of governance" that begat the constitution did not survive, that's more what I was getting at. That each successive system of governance can still legitimately claim to be implementing the U.S. Constitution is indeed impressive.

                • ViktorRay3d

                  Ah I see what you’re saying. That makes sense.

                  I am not a lawyer but many years ago I read about the following doctrine.

                  https://en.wikipedia.org/wiki/Incorporation_of_the_Bill_of_R...

                  Basically prior to the American Civil War the Bill of Rights was considered to only apply to the Federal government and not the state governments.

                  After the Civil War, the US Supreme Court interpreted the 14th amendment such that, over time, all the amendments of the Bill of Rights were considered to apply to the states as well.

                  So what you are saying about one system being dominant over the other system (Federal government being dominant over the state governments) makes sense and it is something that seems to have happened more extensively after the Civil War.

            • immibis2d

              And the current incarnation of capitalism began in the 2008 financial crisis.

          • nisa3d

            Strangely enough, my experiences are mostly from smaller companies in Germany that are not publicly traded - so very different from a typical US public company. My bosses might have had sociopathic traits, but I doubt that's true for all of them. It's more the complete disregard for product quality, the disregard for investing in your employees, and the inability to solve more complicated issues that is pervasive through all these gigs. I'm talking about terrible UI and software bugs, not some small debatable things.

            Maybe it's really about wrong incentives and lack of technical excellence.

            Government money keeps coming in, and keeping things broken and buggy actually assures ongoing contracts. Investment in skilled workers or solving technical issues is not paid for, and everyone - company and customer - is completely disconnected from the end user, while feedback mechanisms are broken or manipulated.

            It's maybe a mix of all the different answers my post got.

        • nostrademons3d

          They're responding to incentives. The only user that matters is the marginal user, the person who didn't previously use your product but now does. They even teach this in MBAs and economics classes. And so the only efforts that matter are those that create a customer, and hence management spends a great deal on promotions, marketing, new customer discounts, advertising, gamification, addictive usage mechanics, lock-in, etc but basically zero on making existing customers happy. It's almost better if they aren't happy - an "ideally run" company is one that has users who hate your product but don't hate it quite enough to quit using it (or if they do, they have no alternatives).

          Enshittification in action.

        • amadeuspagel3d

          If only people were allowed to start their own companies.

          • johannes12343213d

            You can start your own company, but in today's world you are immediately in global competition. Not only in software, but also in many forms of production: a Chinese 3D-printed thing will be shipped cheaply to your customers, and for many services one competes with cheap labor...

            You've got to offer good quality and stand out, which isn't easy without capital.

            • snozolli3d

              You're also competing with multinationals that can exploit tax loopholes and attract tax incentives and grants.

              • hylaride3d

                > You're also competing with multinationals that can exploit tax loopholes and attract tax incentives and grants.

                Not only that, but often with whole government backed companies where the government will gladly support them and even participate in espionage to gain competitive advantages. Huawei is the classic example, but is just the tip of the iceberg.

                Meanwhile in most of the western world, executives are focusing on the next quarterly results...

              • Hoasi3d

                Competition is for losers. All you have to do is create value in an entirely new niche category that you will own by offering something that didn’t exist before and that people want.

                • johannes12343213d

                  Till the multinational corporation with deep pockets decides to go after your niche. They don't have to be profitable inside that niche, while you have to.

              • eppp3d

                Who have also captured regulators and politicians and use them to cement their advantage by making things too costly and difficult for new startups to compete.

            • manishsharan3d

              Also most of the buyers prefer to buy from their preferred vendors. Good luck trying to get on that list. Your products will have to be priced at a fraction of your competitors for them to even evaluate you.

              My employer buys a crap load of crap stuff from Broadcom just because the procurement is easy.

          • jimnotgym3d

            If they have the capital...

        • mitjam3d

          It's the age of thinking instead of doing. Thinking doesn't solve doing problems but we can think and talk them away or at least outsource the doing. -- Hmm, what an interesting thought. Let's think about it some more.

        • forgotoldacc4d

          One big problem is becoming a manager is seen as the end goal, and pay often reflects that.

          Being a great engineer or researcher doesn't pay. You won't get your name known for your work. All your achievements will be attributed to whoever manages you at best, or attributed to the corporation above you with not a single human name at worst.

          People like being recognized for their work. Every great achiever wants to have their name remembered long after they leave this world. Everyone wants to be the next Isaac Newton. The next Bill Gates. The next Steve Jobs. The next Elon Musk. It's a constant downhill path from being known for using your brain and busting your ass to discover or create something, to being known for managing someone who created something, to being known as someone who bought the company that managed people who created something. Motivations are all fucked up. No matter what you discover or create these days, there's a feeling that you're not going to have your name written in history books. Your best options are join a grift or manage someone who's doing the hard work.

          • bee_rider3d

            Maybe…

            I dunno, there’s something in the fact that Isaac Newton the imaginary cultural figure was hit on the head by an apple, and then invented calculus.

            Meanwhile Isaac Newton the actual guy (recalling from memory so feel free to correct) was a bit eccentric (dabbled in alchemy and other mystic arts), had some academic posts, some government jobs, and built Calculus on work that was ongoing in the academic community…

            The imaginary Isaac Newton and the imaginary Elon Musk look sort of similar. Because we ignore the boring work that Newton did and the fact that Musk just bought his way around it—their real versions look very different of course! But if you want the actual day to day experience of being Isaac Newton, you can, just go be a professor and make some quirky friends.

            • ttw443d

              This is actually a very interesting point and highlights the fact that most of the famous polymaths only started being talked about long after they were dead (good example is Maupertuis and his work on action in physics).

        • fungiblecog4d

          this is exactly my experience

      • Art96814d

        You're just getting older and looking at the past with rose-colored glasses. No one is going backwards in capability. It is about how accessible and cheap the thing is. In the 90's, a license to install Maya, 3D Studio Max, or Lightwave was extremely expensive; those products were not promoted nor available to the general public. They would cost tens of thousands of dollars for the software alone, not to mention the hardware.

        Today it is a commodity. So we are flooded with low effort productions.

        With that being said, we have more capability than ever, at the cheapest cost ever. Whether businesses use that wisely is a different story.

        There will always be outliers. I see many comments with people who derived value from whatever they perceived as something uncommon and unique they could do. Now AI has made those skills a commodity. So they lose their motivation since it becomes harder to attain some sort of adoration.

        In any case, going forward, no matter what, there will be those who adopt the new tools and use them passionately to create things that are above and beyond the average. And folks will be on HN reminiscing about those people, 30 years from now.

        • mitthrowaway24d

          But for example Toy Story (1995) had a budget of 30 million. Today's Disney box office flops have budgets closer to 250 million.

          • jkaptur4d

            I think that’s the rose colored glasses again. What made you choose Toy Story as an example rather than Waterworld or Cutthroat Island?

            • mitthrowaway24d

              Because the thread was discussing CG becoming a commodity and Toy Story was the first thing that popped into mind for 90s CG; I have a vague recollection that it was the first feature-length full-CG film.

              I only checked its production budget while writing my comment.

              • ipsum24d

                You're illustrating his point, that you're using survivorship bias to cherrypick good CGI movies from the 90s.

                • mitthrowaway24d

                  Actually, I picked the first CGI movie from the 90s, and it just happened to be good and very cheap.

                  But more importantly, the other half of my point was that $250 million ought to be enough to pay for a high effort production. It's not like "well Blender is free now so of course theatres are flooded with amateur CG films since their production has been commoditized".

                  • physicsguy4d

                    It was the first full CGI movie, but others had been using CGI before that; Jurassic Park used a mix of CGI and puppets, for example.

                  • ekianjo4d

                    30 million was nowhere near cheap in the 90s. I guess inflation makes it look that way.

                    In the same timeframe, Jurassic Park cost twice as much to make, and it was a very expensive movie at the time.

                    • KvanteKat3d

                      Correcting for inflation (I used this tool by the US Bureau of Labor Statistics: https://www.bls.gov/data/inflation_calculator.htm), 30M USD in nov. 1995 would have a purchasing power equivalent to roughly 62M USD in feb. 2025. This is below half the budget of Moana 2 (150M USD, released in nov. 2024) for instance.
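                      For anyone who wants to sanity-check that figure without the BLS tool, the adjustment is just a ratio of CPI values. A minimal sketch in Python; the CPI-U numbers below are approximations I'm supplying, not taken from the comment above:

                          # Rough CPI-based inflation adjustment (CPI-U values are approximate).
                          cpi_nov_1995 = 153.6      # assumed CPI-U for Nov 1995
                          cpi_feb_2025 = 319.1      # assumed CPI-U for Feb 2025

                          toy_story_budget_1995 = 30_000_000
                          adjusted = toy_story_budget_1995 * cpi_feb_2025 / cpi_nov_1995
                          print(f"~${adjusted / 1e6:.0f}M in Feb 2025 dollars")  # prints ~$62M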

                      • ekianjo3d

                        I would never use the official inflation numbers (they underestimate the actual inflation). It's easy to see that the most expensive movie ever made back in the day has a much lower budget than the most expensive movie made now, even adjusted for the official inflation rate.

                  • milesrout4d

                    But it wasn't very good. It was good for the time but if something of that quality came out today it would be a joke.

                • ristos3d

                  There does seem to be a sort of sampling bias that I've only recently noticed, and I think it does come from being older now. I started to get back into old retro games I used to play, and I can't help but realize how many games back then were really bad - not worth playing at all - and I just cherry-picked the good ones. Being older, I'm also not into gaming anymore, or really much of a consumer at all besides essential goods; when you're younger you consume more entertainment products, like games. So I think there's definitely some sampling bias going on here that makes things look like they're getting worse. Or it could be both: things actually are getting worse, just not as much as it looks, because of that sampling bias.

                  The things that do seem genuinely worse: having to have multiple accounts (a Switch account plus some special Switch account and/or another account just to play a game); buying a game and finding there's an online store inside it as well; buying a game in person but not being able to get a digital copy; buying a digital copy but not being able to have a physical copy made for a flat fee; increasingly not literally owning things anymore because it's all subscriptions or some sort of permission to use; a lot of games being just remakes of older games; not being able to play single-player offline; not being able to transfer or give the digital game you "bought" and "own" to someone else (unless it's a physical copy, obviously); etc.

                  • johnnyanmac3d

                    I mean, I see no issue with comparing high-profile old games with high-profile new games. The thing is that there were fewer high-profile bad games because... well, back then when you put in that money you were trying to go for quality, I suppose.

                    It also was because development budgets were microscopic compared to today, so a bad release from a dev team of 5 people and 12 months won't bomb as badly as a 500-person, 5-year "blockbuster" release. So yeah, Superman 64 was laughably bad but didn't sink a company the way Concord did, or even a not-that-bad game like Saints Row.

                    Economy is different, as is the environment. There's still quality, but when a game flops, it's a tsunami level flop and not just a painful belly flop.

              • casey23d

                If the neighbor kid (Sidney "Sid" Phillips) from Toy Story appeared in a modern movie of a similar budget (not even inflation adjusted), people would comment about the bad CGI.

                Toy Story was a good idea because attempts at depicting humans with CGI at the time had a very plastic look.

            • johnnyanmac3d

              Maybe people just didn't want to nitpick the most expensive film of its time? We can compare Waterworld to The Force Awakens if you wish, I suppose.

            • getpokedagain3d

              Waterworld is my partner's favorite film of all time.

        • somenameforme4d

          This is a tangent, but I don't think the cost of things like 3DS or Maya was ever a major barrier to entry. They were widely available for 'free' download. I think the companies involved were basically using this as what would eventually become the modern 'free for entities with less than $xxx annual revenue' license, as there was seemingly less than zero effort to ever enforce their copyrights. To say nothing of the countless commercial books available for both, which simply would not have had a market if they were only selling to people who had real licenses for the software.

        • johnnyanmac3d

          If it's a commodity, why is everything worse in quality? Commodification doesn't explain a drop in objective metrics like performance, security, and complexity. It doesn't even explain the decline in stuff like customer satisfaction.

          I don't think talent is the problem either. There's a lot more talent now than in the 90's.

          • dkislyuk3d

            I think commodification is directly tied to a perceived drop in quality. For example, if the barriers to making a video game keep going down, there will be far more attempts, and per Sturgeon's law, the majority will be of low quality. And we have a recency bias where we over-index on the last few releases that we've seen, and we only remember the good stuff from a generation or two ago. But for every multitude of low-effort, AI-generated video games out there, we still get gems like Factorio and Valheim.

            • johnnyanmac3d

              Sturgeon's law was true in the 90's and is true in the '20s. There's not much point in comparing the crap to the crap. The only big difference is that it is easier to see the bottom of the barrel in your most popular storefronts with a click (even on "curated" ones these days, with PSN and the eShop) instead of going out of your way to find some shareware from a GeoCities page that barely functioned.

              Thing is those high profile disasters are still supposedly the "cream of the crop". That's why they get compared to the cream of before.

              Popular examples are also easier to cite, instead of taking the time to explain what Blinx the Cat or Midnight Club are (examples of good but not genre-defining entries).

        • yubblegum4d

          Boeing calls to say hello...

          • SR2Z4d

            This is an excellent example. The 737 used to crash a LOT MORE than it does now, and even the version with MCAS is a safer airplane than your average 1990s jetliner (which was also probably a 737, which is kind of my point)[1].

            So it's a slightly less safe (in the grand scheme of things) airliner that's vastly more fuel efficient and cheaper to run than any in the past. Obviously this is of no comfort to the families of the people who died in the crash!

            But to suggest that Boeing has somehow regressed decades in technical capabilities is just plain wrong.

            1: https://en.wikipedia.org/wiki/List_of_accidents_and_incident...

            • sidewndr464d

              When you look at how type acceptance works, all Boeing planes are going to be a "737". When you actually look at the 737 MAX and compare it to earlier aircraft, it's pretty obvious that they've created a whole new aircraft. Hence MCAS, trying to pretend that didn't happen - and dead people.

            • hylaride3d

              It wasn't that the 737 MAX isn't (or can't be) a safe aircraft, but that Boeing prioritized making it fly like its predecessor over all else - including safety.

              The fact that two brand new MAXes crashed, killing all aboard, within 2 years of its commercial introduction (out of only ~600 models flying at the time) is a brutal safety record for the jet age, especially as the cause of the crashes was the plane itself. The list you post includes any and all reported incidents that merely involve 737s (including incidents caused by factors that aren't necessarily related to the safety of the plane itself).

              • SR2Z3d

                > It wasn't that the 737 MAX isn't (or can't be) a safe aircraft, but that Boeing prioritized making it fly like its predecessor over all else - including safety.

                It's misleading to say they prioritized making it fly like its predecessor over safety.

                In theory there was absolutely nothing wrong with a system LIKE MCAS. In fact the 737 MAX is still approved to fly with it.

                The flaws were in the specific implementation and documentation around it, not with the idea of the system itself.

                > The fact that two brand new MAX's crashed killing all aboard within 2 years of its commercial introduction (out of only ~600 models flying at the time) is a brutal safety record for the jet age, especially as the cause of the crash was the plane itself.

                If you want to be pedantic about it, the reason for the crashes is that the pilots failed to recognize trim runaway during takeoff. The trim runaway was caused by MCAS, but this is not a new failure mode for ANY aircraft and pilots get extensive training on how to manage it [1].

                MCAS failing was not an unrecoverable error [2]. It failed several times in the US, as well, but American pilot training standards are very high compared to the places where there WERE disasters and the pilots recognized and recovered quickly.

                I say this not to deflect blame from MCAS. Its original implementation was unsafe and should never have been approved.

                A large part of why modern jetliners are so safe is exactly because of flight control augmentations like this - both Boeing and Airbus have been implementing these for decades and they have made flying much safer. Your suggestion that any system like MCAS is always unsafe (or that Boeing was somehow doing something wrong by adding it) is totally wrong.

                1: https://www.aopa.org/news-and-media/all-news/2017/july/pilot... 2: https://www.nytimes.com/2019/09/18/magazine/boeing-737-max-c...

                • ApolloFortyNine2d

                  >A large part of why modern jetliners are so safe is exactly because of flight control augmentations like this - both Boeing and Airbus have been implementing these for decades and they have made flying much safer. Your suggestion that any system like MCAS is always unsafe (or that Boeing was somehow doing something wrong by adding it) is totally wrong.

                  The common person often doesn't realize this at all. Every modern plane is flying itself essentially, with hints from the pilot on what to actually do.

                  >MCAS failing was not an unrecoverable error [2]

                  Also this is frustrating, especially in the case of the second crash where every max pilot knew the procedure (including the one that crashed), they even performed the procedure but then disabled it a minute later. Both the NTSB and the BEA (French equivalent) agreed pilot error/CRM played a role in the second.

                • hylaride2d

                  > It's misleading to say they prioritized making it fly like its predecessor over safety.

                  The internal Boeing emails literally say otherwise.

                  > In theory there was absolutely nothing wrong with a system LIKE MCAS. In fact the 737 MAX is still approved to fly with it.

                  I never said that MCAS had any issues in theory. And the 737 MAX was mostly "approved" by Boeing's self-regulators, where email trails (again, literally) show anybody raising questions or concerns being sidelined.

                  > The flaws were in the specific implementation and documentation around it, not with the idea of the system itself.

                  Yes, because Boeing's top priority was making it so that no expensive extra training was required to fly the MAX, despite the fact that MCAS was designed to deal with some situations that could cause the plane to fly differently.

                  > If you want to be pedantic about it...

                  Yes, I am being pedantic about it. The trim issues in the crashes were (intermittently) caused by MCAS, but there was no specific documentation or training on how to deal with it in the case of faulty MCAS sensors. There were indeed several MCAS incidents on western flights, but they were different because the failures were different. The pilots on the two crashed flights did indeed attempt disabling MCAS, but the intermittent failures masked the problem and there were insufficient checklists from Boeing; had those existed, it could have alluded to the fact that such situations might need new simulator time.

                  The MCAS issue was totally and completely recoverable if it were properly documented, but doing that would have almost certainly guaranteed the simulator time that was Boeing's top priority to avoid.

                  Almost all the reports about pilot capability differences had more to do with experience than with training. These "developing" countries have younger airlines and pilots who don't have the same pipeline of pilots with decades of experience, including military experience like in the US. MCAS "acted up" on several other Lion Air flights that the pilots corrected for as well, but again those were different failure modes.

                  The fact that Ethiopian Airlines had a perfectly acceptable safety record on other planes negates the idea that these were "poorly trained" pilots. They had one major accident in 2010 that was attributed to pilot error, but most of the rest were due to bad luck (e.g. bird strikes) or hijackings.

                  > A large part of why modern jetliners are so safe is exactly because of flight control augmentations like this...

                  I never even mentioned MCAS by name. Yes, modern jetliners are safe because of these kinds of systems. Airbus planes will not allow pilots to do many things no matter what, even. But these systems are documented, pilots trained on them, and go through rigorous testing because in most cases they're designed to make a plane safer, not try to deal with aerodynamic changes.

                  Boeing wanted no new simulator training despite the MAX being a very different aircraft due to changed engine placement. That was the cause. If Boeing wasn't trying to avoid new simulator training the 737 MAX is a perfectly fine aircraft as far as we know.

                  The 787 had similar issues as the overriding goal of the program was to get as much capital expenditure off of Boeing's books, but all of the outsourcing led to a nightmare when trying to assemble the plane and there was no unified quality control program, or even a straight line of responsibility.

            • oxfordmale4d

              The problem isn't that Boeing is less safe; it is that the company's culture shifted to the extent that technical staff could no longer report perceived safety issues.

            • gcanyon4d

              Thanks for that list/link. It's so easy to forget how awful the past was.

            • ekianjo4d

              This is an excellent example of design failure. So in reality, yes, Boeing has regressed as 90s Boeing would never have released a plane like that.

            • uppost4d

              The market is very wrong if that's the case; you can get rich easily buying BA at a discount. Please do!

              • echoangle3d

                Maybe the world has just moved on and standards from the past just aren't as valuable now as they once were?

                Boeing being better now than in the 90s doesn't mean that the stock shouldn't drop, because competitors and expectations are higher now than in the 90s.

          • Art96814d

            All I know is more often than not, when I travel by air, I am more than likely going to be in a Boeing aircraft. I also know more planes fly today than ever. So it is no surprise that in a hyper connected world where everyone with a phone and a camera can document every single failure in any service or product, that it would be perceived that things are going backwards. If we had this in the 80's or 90's it wouldn't be much different.

            Did our quality and capability get worse or did everyone become a journalist that can document every flaw and distribute it globally in minutes?

            Hmmm....

            • SecretDreams3d

              QC hasn't scaled with volume, and the robustness introduced by better engineers (now retired) has been peeled back by juniors and managers that didn't understand the designs in the first place and are chasing mass and profit instead.

              • trollbridge3d

                QC is one of the first things to get cut. In software eventually customers punish you when your software is so bad that they can’t really use it at all. Aircraft used to be regulated, but Boeing somehow carved out “self regulation”. It’s the same problem as how washing machines don’t last as long as they did 20 years ago.

            • yubblegum4d

              Hmmm... OK, Airbus calls to say hi, too.

            • johnnyanmac3d

              Our capabilities didn't get worse. Management caused a shift in culture to make things cheaper, even if that made them worse. We call it enshittification these days, but it's been decades in the making.

              All this discussion assumes that Boeing engineers didn't catch this stuff and weren't ringing alarm bells when things completely failed inspection. The problem was that the people in power ignored it. This is an entirely social issue created by business demands, not one of lacking expertise or standards.

          • anon-39884d

            Yeah, let's take that flight with airplanes from 30 years ago, shall we?

            • bronco210164d

              Check the registration on the next aircraft you take a ride on ;)

              30 years may be a stretch but 20-25 certainly isn't.

              • somenameforme4d

                Counter-intuitively, this is especially true with international flights... The main stressor for a plane is not like a car, where it's miles driven/flown; it's pressurization/depressurization cycles. So a plane doing domestic skips an hour or two away will wear out way faster than one doing transatlantic trips, and you're more likely to see the shiny new plane on a short domestic trip than on a big international one.

                Incidentally this also applies similarly to risk issues. The biggest risk in a flight is not in flying, but in takeoff/landing. This is why the commonly cited deaths/mile metric is not only misleading but completely disingenuous by the people/organizations that release it, knowing full well that the vast majority of people don't understand this. If some person replaced their car with a plane (and could somehow land/take off anywhere), their overall risk of death in transit would be significantly higher than if they were using e.g. a car. 'Air travel being safer than cars' relies on this misleading and meaningless death/miles statistic.
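                A toy model makes the point concrete. All numbers below are invented purely for illustration, not real accident statistics: if a fixed chunk of risk sits in takeoff and landing, a short hop carries nearly the same per-trip risk as a long haul, so an aggregate per-mile figure tells you little about whether a particular trip is safer flown or driven.

                    # Illustrative only: hypothetical risk figures, NOT real safety statistics.
                    RISK_PER_FLIGHT = 1e-6        # assumed fixed risk per takeoff + landing
                    RISK_PER_CRUISE_MILE = 1e-10  # assumed risk per mile in cruise
                    RISK_PER_CAR_MILE = 5e-9      # assumed risk per mile driven

                    def flight_trip_risk(miles):
                        return RISK_PER_FLIGHT + RISK_PER_CRUISE_MILE * miles

                    for miles in (50, 300, 3000):
                        fly = flight_trip_risk(miles)
                        drive = RISK_PER_CAR_MILE * miles
                        print(f"{miles:5d} mi  fly={fly:.2e}  drive={drive:.2e}  "
                              f"fly-per-mile={fly / miles:.2e}")

                    # With these made-up numbers the 50-mile flight is riskier than driving it,
                    # while the 3000-mile flight is far safer; an aggregate deaths-per-mile
                    # statistic, dominated by long cruise segments, hides that difference.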

                • MaPi_4d

                  Sure, replacing the car with a plane for your grocery shopping would probably be more dangerous, but do you have any data on at what distance the risks flip?

                  When I see those statistics I think about flights like Austria to Finland and I imagine that is indeed safer by plane.

                • laszlojamf4d

                  This is really interesting! Never thought of that. Do you have a source for these facts?

      • bko4d

        I was thinking about examples of where things got worse over time. They include some common appliances that use water, due to water use regulations. No reason my dishwasher should take over 2 hours to run. But then there's other things like food delivery.

        I used to deliver pizzas in the early 2000s. I would get paid

        $4/hour (later bumped to $5 per hour)

        $1/delivery (pass through to customer)

        + tips

        I had good days / times where I was pretty much always busy and made around $20/hour by the end.

        So delivery cost the customer $1 + tip (usually ~$3), cost the business maybe $40 a night (~2.5 drivers for 3 hours), and I made out pretty well.

        I can't compare exactly but I feel like today the business pays more, the customer pays more, the drivers get paid less and it's all subsidized by investors to boot. Am I totally wrong on this? But I feel like delivery got so much worse and I don't know where the money is going.
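        For what it's worth, those early-2000s numbers hang together arithmetically. A quick sketch; the deliveries-per-hour figure is my assumption, chosen only to show how the ~$20/hour and ~$40/night figures can both be consistent:

            # Back-of-the-envelope check of the early-2000s delivery figures above.
            base_wage = 5.0            # $/hour, after the bump from $4
            per_delivery_fee = 1.0     # $ per delivery, passed through to the customer
            avg_tip = 3.0              # $ per delivery ("usually ~$3")
            deliveries_per_hour = 4    # assumed pace on a busy night (my assumption)

            driver_hourly = base_wage + deliveries_per_hour * (per_delivery_fee + avg_tip)
            print(f"driver earns ~${driver_hourly:.0f}/hour")  # ~$21/hour

            # Business-side cost of the wage floor: ~2.5 drivers over a ~3 hour rush.
            business_wage_cost = 2.5 * 3 * base_wage
            print(f"business pays ~${business_wage_cost:.0f}/night in base wages")  # ~$38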

        • rurp4d

          I'm glad I'm not the only one noticing appliances getting worse across the board. I don't buy enough of them to really know if the trend holds overall, but the correlation is pretty much perfect between how new an appliance is and how much I hate it. For example, the controls on my new LG washer and dryer are incredibly bad. They've made it hard to impossible to just set the run level manually, to push you into set programs for bedding or whatever. But they never work right! We've given up on using those programs entirely because they are terrible.

          The main culprits I've seen are cheaping out on quality, replacing traditional controls with touch screens or "AI" magic buttons, and squeezing in more monetization streams or adding gimmicky features that actively make the product worse.

          Maybe things will turn around someday. There are a few rays of hope, like the touchscreen fad in cars gradually losing its luster, but it seems like we've been on the wrong path for a long time and I'm not sure it will ever correct.

          • throwing_away4d

            I'm also a software engineer and recently got new LG washer and dryer.

            I have not yet figured out how to manually change the settings, as the buttons don't do anything when you press them.

            I leave it on "normal" and it seems fine, and surely there is a way to activate those buttons, but I haven't found it.

            I could probably install the app on my android device and use it to connect them to wifi, where I could presumably configure them.

            Instead, however, I am looking at electronics-free diesel trucks.

            • mmierz4d

              I have an LG washer/dryer as well. On mine, you need to rotate the large central knob by one "click", then the buttons start to work.

              Why? No idea.

            • resize29963d

              How's that diesel truck research going? I've thought about getting into auto repair as a hobby, but wrenching on modern cars would basically be my day-job with added wire fiddling.

              • throwing_away2d

                I learned that most diesel cars and trucks are still dependent on electronics for fuel injection and also have a higher minimum-quality of diesel.

                The prepper nerds seem to advocate for Cummins 12-valve engines from the 1990s or the Toyota 1HZ.

                There's a whole lot of old diesel LandCruisers out there. I'm guessing that's the sweet spot for it still being a normal car that mechanics can maintain while still being comfy and looking cool.

            • ahartmetz4d

              You can avoid buying crap with obfuscated UIs... mostly.

            • theendisney4d

              Maybe a washer can be made without electricity. Have a tank of cold compressed air and use a blow torch to heat it and make new compressed air.

              • _heimdall4d

                Look up the Wonderwash. It basically looks like a small propane tank on an axle that you manually spin to wash your clothes. It works surprisingly well for washing clothes without electricity.

                • rurp3d

                  How long does it take to manually dry a normal load with that thing? My dryer takes at least an hour to dry a modest sized load of normal clothes so I would expect a non-powered one to take many hours, but maybe my modern dryer is really so bad it doesn't save much time.

                  • _heimdall3d

                    With the first few loads I wrung clothes out by hand and tossed them in the dryer. I didn't notice it taking longer, but I also didn't time it, so 20-30% longer wouldn't surprise me.

                    After a few loads I bought a hand-crank clothes wringer, basically two rollers that squeeze the water out. That thing honestly works better than a spin cycle; clothes come out drier than they do from the washer, and I've noticed the dryer finishing faster (I usually run it on an auto-sense mode rather than a timer).

        • maxsilver4d

          > I can't compare exactly but I feel like today the business pays more, the customer pays more, the drivers get paid less and it's all subsidized by investors to boot. Am I totally wrong on this?

          It is exactly that! Food delivery is an excellent example of 'things just got worse'.

          In 2019, 'delivery' was a specialty a restaurant would have to focus on to offer. Pizza places (Papa Johns, Pizza Hut, etc) and other specific delivery-focused restaurants (such as Panera Bread, Jimmy Johns, or your local Chinese restaurant) would have actual W2 employees who did delivery driving, as part of their job. The restaurant would want deliveries to go well (for both the customer, as well as the driver), so would make sure their own staff had reasonable access to food, some light training, and would ensure they could deliver it somewhat well. (They would reject orders too far away, they wouldn't serve food that wouldn't survive a delivery trip well, etc)

          In post-COVID 2025, "every" restaurant offers delivery, but almost no restaurant still employs their own delivery drivers (locally, Jimmy Johns might be the only one left). Everyone else just outsourced to DoorDash. DoorDash drivers are employees who are 'legally-not-employees' (1099 employees), so they no longer have any direct access to the restaurants, and they can't train well for any specific service, because they might have to visit any-of-50 restaurants on any given day, all of which have entirely different procedures (even if they are the same brand or chain). Restaurants have zero incentive to ensure deliveries go well (the drivers aren't their employees, so they no longer care about turnover, and customers have to use DoorDash or Uber Eats or equivalent, because almost every restaurant uses it, so there's no downside to a DoorDash delivery going bad).

          Prices to consumers are double what they were in 2019, or higher, depending on the restaurant. Wages are down, employment security is entirely eliminated. Quality and service have tanked.

          Presumably, investors make slightly more money off of all of this?

          • Cthulhu_4d

            And on paper the idea for services like Doordash was good - 3rd party delivery company so the restaurant doesn't have that liability, staff, or investment, delivery people aren't working for just one store so they have more work if the one restaurant is quiet, etc.

            But since it's all investor and profit driven for the bigger company, costs get cut on every side.

            • disgruntledphd23d

              > But since it's all investor and profit driven for the bigger company, costs get cut on every side.

              These services basically don't work with Western level wages. The economics are just not there.

              • maxsilver3d

                > These services basically don't work with Western level wages. The economics are just not there.

                Only because the service is disrupting the business model for those wages.

                It worked well(ish) in 2019, it failed by 2022. It's not some kind of mystery around wages or inflation, the introduction of these services (and their popularity and growth, due to COVID closing in-person restaurants for a while) is the thing that killed the economics around delivery, for much of the US.

              • johnnyanmac3d

                But it worked 20 years ago. That's the issue. We didn't get worse at driving, and (apparently...) we didn't get too poor over the last few decades to pay for stuff.

                Cost of living simply skyrocketed in the fallout of '08, and most of the country did not see compensation increase to make up for that. Despite that, companies simply charged more while cutting costs at the same time. So the driver and the customer both lost out.

                • nbaugh13d

                  I think by "these services" they mean apps like Doordash, not delivery as a concept

                  • immibis3d

                    And they're saying: Why couldn't Doordash work when pizza delivery worked 20 years ago? Doordash is just pizza delivery scaled up, right? If pizza delivery had continued, it should work the same way as doordash for the same price as doordash but limited to pizza restaurants (by definition), right?

                    • johnnyanmac3d

                      It is odd. It's such an easy middleman role to fill, and it should theoretically operate on modest margins. Instead it sounds like they go for this stupid extortion of a cut behind the scenes, which ruins it for everyone. Based on Cowto's operations, few would complain about a 14% uptick on delivery (and the tip was already culturally accepted; no more work to do there). Instead, you can double your meal price nowadays.

                      I know much of the answer is a mix of private equity and an overload of debt taken on at insane valuations.

                    • disgruntledphd23d

                      For a few reasons:

                      1. Pizza travels very very very well

                      2. Pizza is pretty cheap to make

                      3. Wages (and costs of transportation) were lower 20 years ago.

                      More generally, delivery as a model can work, but not when you have an organisation of really expensive engineers/salespeople working on a frontend to it.

              • pc863d

                DoorDash is just a means to extract revenue from users, contractors, and restaurants and give it to the owners of DoorDash. It's not a sustainable business and I'm not convinced it was ever intended to be. COVID was a fluke that made it hang around a lot longer than it otherwise would have.

                Edit: ooo someone's mad I don't like DoorDash

                • Workaccount23d

                  The killer feature of doordash/ubereats/grubhub/whatever is the ability to shop for food from a ton of different restaurants in one place with one payment button.

                  Back when I was doing food delivery before the pandemic, we would actively promote placing orders with our restaurant directly. I would tell repeat doordash customers that they can save 15% if they just call or use the website.

                  None of them converted. The convenience of the app is just too strong for people to care.

                  • pc863d

                    The only time I've seen people care is if you lay out an identical order and show the percentage increase over ordering delivery through the restaurant, or picking it up yourself.

                    DoorDash adds upwards of a couple dollars to every item. They charge a 5-10% service fee depending on if you pay them monthly. The default tip options are pretty egregiously high - it's not uncommon to see double-digit tips in all three options. I once saw a $22 tip in the top option for a single bag of food with no drinks less than a 10 minute drive away but that's likely an outlier.

                    All in, if you don't have DashPass you're easily looking at a 30-40% increase, especially if you order cheaper items, which are more likely to be marked up only $0.50-1 but represent a larger percentage of the total.

                    Nobody in their right mind would tip a delivery driver 40% of their entire meal, why are you happy to give most of that to a corporation that is doing very little for the transactions?

                    Edit: I just did this for an example order for a nearby restaurant - 1 appetizer and 1 entree so probably good for two people not super hungry to share or one hungry person to eat.

                    $31.59 in food, $2 delivery fee and $5.50 in fees (I subtracted sales taxes manually). This restaurant is 5 miles away and it's 11:30 local time. Tip suggestions are $9.50 (30%), $11.50, and $13.50 (42%). So at the lowest suggested tip amount, which is offensively high, you're looking at $49 before sales tax.

                    The exact same order is $26.36 including an online order service fee but before sales tax. Even if you were going to get it delivered and tip the driver 30% you're still saving a ton of money and this is on one meal with enough food for 1-2 people.

                    The appetizer alone is $10 on their website and $14 on DoorDash. It's a crazy system and I can't believe how much money people burn on this every year.
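                    Putting the two totals side by side (the figures are taken from the example above; the 30% direct-order tip is my assumption):

                        # DoorDash example order vs ordering direct, pre-sales-tax, figures from above.
                        dd_food, dd_delivery_fee, dd_other_fees, dd_lowest_tip = 31.59, 2.00, 5.50, 9.50
                        doordash_total = dd_food + dd_delivery_fee + dd_other_fees + dd_lowest_tip

                        direct_food = 26.36              # same order on the restaurant's own site
                        direct_tip = 0.30 * direct_food  # assumed generous 30% tip to the driver
                        direct_total = direct_food + direct_tip

                        markup_pct = (doordash_total / direct_total - 1) * 100
                        print(f"DoorDash ${doordash_total:.2f} vs direct ${direct_total:.2f} "
                              f"(+{markup_pct:.0f}%)")   # ~$48.59 vs ~$34.27, roughly 42% more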

                  • getpokedagain3d

                    The killer feature of this and like 90% of apps in this market capture model is laziness. They realize if you can make something easy for someone they will do it even if it’s not quite what they want.

                    Users want good fresh food delivered at a reasonable price. But they are willing to tolerate shitty cold food at an expensive price because it’s a tiny bit easier to do than picking up the phone.

                    We are lazy by nature, to preserve energy, and many, many companies have perfected finding the right ratio of how fucked we will allow ourselves to be to tickle that lazy button.

                  • kccqzy2d

                    Yes but having one payment button doesn't mean the service needs to manage its own delivery.

                    Consider the example of Amazon marketplace. You still have one payment button and you can still shop for things from different vendors. Yet the order fulfillment can be done by Amazon or the seller directly.

                    If such an arrangement is possible on Amazon, it must mean that there are shops that trust their own fulfillment more than they do Amazon's. It is entirely possible that some restaurants will want to own that delivery experience as well.

                  • mathteddybear3d

                    Yeah, but this convenience goes well beyond the "one payment button".

                    If you order food directly, you won't have the delivery tracking on the map. Even within the app, if the restaurant provides their own couriers, you lose the visibility and arrival ETA info.

                    And 15% might look impressive, but if you are getting your food from a delivery app, you probably don't care that much about food price in the first place.

          • msabalau3d

            Based on your description, consumers also get a wider range of delivery options.

            Also, it is a little odd that people are dining out more often if the experience is worse.

            But, yeah, wages and employment being down is the most relevant change.

          • jimnotgym4d

            Probably not the investors, but the Private Equity managers always win

            • kccqzy2d

              There are plenty of other businesses wrecked by Private Equity but DoorDash isn't one of them. It's publicly traded.

          • bluecheese4523d

            The obvious solution should be people stop using it and the system collapses. I haven’t had food delivered since 2020. Sadly it seems most people feel it is good enough.

        • gmac3d

          > They include some common appliances that use water, due to water use regulations. No reason my dishwasher should take over 2 hours to run

          I don't think this is a great example, because saving water (and thus the energy needed to heat the water) is both a social good and a private good.

          Your new dishwasher program might take longer because, for example, (a) it's more efficient to soak residues than to keep blasting away at them, but it takes longer, and (b) if you alternate between shooting water at the top and bottom drawers (but not both at once) then you can get away with using half the water, in twice the time.

          Most dishwashers have an 'express' programme that uses more water and energy to finish faster, so if that matters you can still have it. If it doesn't matter to you (e.g. because you're running the dishwasher overnight, or while you're at work), you and everyone else benefits from the greater efficiency.

          So I think this is an unambiguous improvement. :)

          The average quality of appliances is a separate question. Anecdotally, I finally had to replace a 22-year-old Neff dishwasher. I got a new Bosch one (same firm, different logo), and have been pleasantly surprised that the new model is still made in Germany, seems pretty solid, washes well, and is guaranteed for 5 years.

        • mym19904d

          Not sure about price comparisons but what I can say is that many experiences feel worse. Paying 50-60$ for 3 tacos to be delivered, going out to basically any restaurant, pricing models on almost any subscription service(Adobe good example).

          It’s led me to learn to DIY as much as possible, making my own fun and experiences so to say.

          • jfengel4d

            The price of a delivery is going to be proportional to the distance, not the cost of the food. The delivery is the same overhead whether it's three tacos or a five course gourmet dinner for eight people.

            • johnnyanmac3d

              It really isn't these days. Ordering tacos down the street has the exact same extra costs as ordering Burger King across town: same "service fee", same delivery fee, and the driver isn't tipped more or less (just the expected % of your order).

            • mym19903d

              Eh, I technically pay 0 in delivery fees due to uber one or whatever but the fees somehow pile up anyways…the menu price of an item somehow doubles or triples I kid you not by the time I check out.

            • immibis2d

              Yet these companies add a few dollars to each item and then make shipping cheap - for presumably the same reasons Amazon does.

          • fijiaarone4d

            Why bother paying for tacos to be delivered when you can blend the soybean & canola oils yourself and stir in salt, msg, corn syrup, and food dyes yourself and drink it as an emulsified slurry that tastes the same as every restaurant?

        • marcosdumay4d

          > Am I totally wrong on this?

          You are probably getting more, and the difference (and then some) goes entirely into rent.

          Real estate is destroying the world's economy.

        • woah4d

          The money is going to the driver's rent.

        • nradov4d

          Donald Trump made appliance water use regulations an issue in the 2024 presidential campaign. Of course his opponents mocked him for it, and it probably was a little silly, but the messaging was effective in getting voters fired up about government overreach.

          https://www.msn.com/en-us/news/politics/biden-allies-call-tr...

        • immibis3d

          AFAIK the reason that newer dishwashers (new more than a decade ago) take a 2 hour cycle is that it's a more efficient cycle in both energy and water, but not in time.

      • arkh4d

        > Whenever people see old systems still in production (say things that are over 30 years old) the assumption is that management refused to fund the replacement.

        The problem is not refusing to fund replacements. The problem is refusing to fund maintenance.

        A lot of managers in old school business were sold on IT as a tool. And tools? You buy them, use them and replace only when they break. Maintenance is minimal and you sure don't evolve them.

        That's how you get decades-old software chugging along, being so key to operations that everything you want to add has to be aware of it and its warts, which then infect whatever touches it. And replacement projects cannot work because usually they mean changing how things are done.

        But 20 years of rot are a symbiosis between users and tools:

        - some tool does not allow a workflow, so users manage and find a workaround

        - there is a workaround so next version of the software landscape cannot break it

        - people want to do some new thing which is not in the software, changing it could break the previous workaround. So either people don't do the new thing or adapt and create other workarounds

        Multiple rounds of this and you have a fossilized organization and IT where nothing can be easily changed. The business cannot adapt. The software cannot be modified to allow adaptation because it could break the business. Now a new competitor emerges, the business is losing and that's when everyone starts blaming everyone for the problems. But in reality? The cause is 20 years ago when some management decided to add IT as a cost center.

        My solution to this problem? Create your own competitor and kill the old business.

      • ergonaught4d

        I’m sure that’s a factor, however we need to probably also acknowledge that “younger people” (whether developers or managers or etc) lack exposure to things that were genuinely better previously (and where technology is concerned there are many examples), and thus have no mental model for it. They literally don’t know any better, and they’re operating within that framework.

        Crude oversimplification: if all you’ve ever known are slow and bloated web app UIs on mobile phones, you’re simply not going to know how to make good design/development choices outside that environment.

        • andai4d

          I don't think that's the reason.

          If it were necessary to have seen something to know it's possible, nothing would ever improve, and nothing new would get made.

      • Cthulhu_4d

        > Go watch CGI in a movie theatre and it's worse than 20 years ago

        Objectively this isn't true as CGI technology has improved by leaps and bounds (think e.g. subsurface skin scattering in new vs old Gollum), however there's a lot of other factors at play; old CGI used film tricks to make it blend better, new CGI uses full CGI and digital whatsits and doesn't care anymore. It also depends on budget and what studio takes care of it. Good CGI is invisible, and there's a number of non-superhero films where the CGI just isn't visible / you're not even aware of it. Anyway, what 20 year old CGI are you thinking about, and what are you comparing it with? I'm thinking The Spirits Within (2001) or Beowulf (2007); the former did not age well, the latter was already panned as having poor CGI when it came out. Avatar (2009) pushed the frontier again I think.

        > go home to play video games and the new releases are all remasters of 20 year old games because no-one knows how to do anything any more.

        This is a blinkered view of reality; there are thousands of game developers outside of this bubble, from single-person developers making modern classics like Stardew Valley, or Minecraft when it first came out, to teams of developers that are bigger than those that made the games of 20 years ago.

        Also, your opinion isn't fact; in the top 20 best selling games of 2024 [0] there is only one arguable remaster (GTA 5, which is on its 3rd remaster) and two complete remakes (FFVII Rebirth and CoD 3), with the former being a completely different game compared to the original. I share your cynicism about the "top of the line" video game market today, but you're not correct.

        (meanwhile I'm playing a 2007 video game, Supreme Commander)

        [0] https://www.gamespot.com/gallery/2024s-best-selling-games-in...

        • ViktorRay3d

          Actually I think your examples show that it is you who may be incorrect.

          Stardew Valley is 9 years old.

          Minecraft is almost 16 years old. The current version of the game has not dramatically changed in terms of the experience of most players of the game in over 10 years. (Hardcore players of any game will always make a big deal of any minor changes).

          I was born in the 1990’s. I was playing games regularly in the 2000’s and the 2010’s although I don’t play as much today.

          Hardly anyone in 2005 was playing 1996 games or 1989 games regularly.

          Even in 2015 not many were playing 2006 or 1999 games regularly. (I think World of Warcraft was the only very popular old game in 2015)

          But now in 2025 you bring up a 2016 game and 2009 game to argue with that other guy?

          Hell what happened to the major big budget games? I remember playing Witcher 3, Red Dead Redemption 2 and Cyberpunk 2077…but even those games are ancient now. Witcher 3 is 10 years old, RDR 2 is 7 years old, Cyberpunk is 5 years old…

          In 2015 I was playing games more often but I was playing games that were more recently released…. Not really games from 2010, 2008 and 2005….

          Hell, the most popular game for kids now is Fortnite, which is 8 years old and came out in 2017! I wasn't playing Mass Effect (2007) much in 2015. The time gap between Mass Effect 1 or Elder Scrolls Oblivion and The Witcher 3 is the same as the gap between Fortnite's release and 2025!

          • ttw443d

            Don't forget Roblox as well. It's past the 20-year point if you count development time, although it is slightly different because the platform has changed substantially since 2006.

      • bluescrn3d

        > Go watch CGI in a movie theatre and it's worse than 20 years ago, go home to play video games and the new releases are all remasters of 20 year old games because no-one knows how to do anything any more.

        Even more glaring is TV shows, where you now get an 8-episode 'season' every 2-3 years rather than the old days of 20+ episode seasons every year, often non-stop for 5 or more years.

        It's not so much about capability/competence as pushing production values to unsustainable levels. You could get away with much less expensive VFX, sets, and costume when filming in standard definition. Now every pixel is expected to look flawless at 4K.

        Another more controversial factor is that everyone brings their politics/activism to work and injects them into everything that they do. Now everything has to be pushing for social change, nothing can just be entertainment for the sake of entertainment.

        • ngetchell3d

          Is that a change? George Lucas certainly brought his politics around Vietnam to Star Wars. The 70s were a very radical and political time for movies.

          • havblue3d

            Saying that Stormtroopers are bad or that the Vietnam war was bad isn't really that controversial or partisan.

            • ozmodiar3d

              Maybe not now. I wasn't alive for the Vietnam war, but I remember that saying anything bad about the second Iraq war was a quick way to get fired and receive a bunch of death threats for a few years. Now things have flipped, but you've got to keep in mind that attitudes change and people like to pretend they never supported viewpoints that have become unpopular.

          • alabastervlog3d

            Star Trek (1960s), MASH (1970s)—shit, the entire history of sitcoms that are regarded as decent-or-better is mostly just one fairly "political" series after another, going back to the earliest days of TV. I mean FFS today people'd probably complain that the first Star Wars movie (not just ROTJ, to which you allude) is "woke" because Leia's the only consistently-competent character out of the three leads, and is by far the least-whiny. "Boo! Why is the elite-educated noble woman who's also already deeply involved in an armed resistance so much cleverer and more-effective and cool under pressure than our farmboy hick hero who's away from home for the first time and wallowing in a whole pile of recent trauma and grief, and this random flaky braggart scumbag they picked up?"

            Anyway, stuff like Dirty Harry or a bunch of traditional Westerns are extremely political in the same ways that "woke" movies are (presenting and normalizing certain roles and behaviors, presenting politicized views of history and of certain groups, ways of life, and attitudes, and using caricatures of their political opponents as bad guys), they're just not liberal so that means they "aren't political".

            Hell, most of the silent films that were good enough that anyone still gives a shit about them are plenty political, and often (but not always) rather liberal.

            • bluescrn3d

              > I mean FFS today people'd probably complain that the first Star Wars movie (not just ROTJ, to which you allude) is "woke" because Leia's the only consistently-competent character out of the three leads

              Compare+contrast with 'The Last Jedi'. Turning the male characters into total idiots and sending them off on a massive wild goose chase, before the day is saved by completely breaking the physics of the Star Wars universe, making all the previous heroes look like idiots for not using a relativistic kill vehicle against the Death Star!

              I don't remember hearing any complaining about strong female characters in the era of Leia, Ellen Ripley, Sarah Connor, Major Kira, Susan Ivanova, and so on.

          • dominicrose3d

            But Star Wars didn't get a 1.8 rating on IMDb, and the CGI was good for its time.

          • bigstrat20033d

            What has changed is not so much that writers' work is influenced by their politics. As you said, that has always been the case. What has changed is that writers these days don't recognize that their work needs to be a good work of art first, and a way to express their views second. They lack any skill in subtlety or nuance, so the work becomes little more than a soapbox for the writer that is off-putting to all but the most ideologically aligned audience.

            I like to use Star Trek in the 90s as a good example of what I mean. While there are episodes where the writers got preachy (they're only human I suppose), most of the time the writers were very careful to not openly take sides on the issues they raised. Even if you got the sense that the writer for an episode might feel a certain way about the topic, the characters wouldn't tell the audience how to feel. They didn't call other characters who disagreed with them names. They didn't just bully their way to victory in the story. The topics were treated as complicated issues where reasonable adults could disagree.

            Compare that to shows/movies/books today. The writers treat the story primarily as a vehicle to express their opinions on issues. They have characters tell people "this is how a decent person behaves", with the understanding that the message is really meant for the audience. They have characters who agree with them call their opponents bigots or worse insults. They portray said opponents as villains or morons who only hold their beliefs because of how evil/stupid they are. They have the "good guys" run roughshod over anyone who disagrees with them, and they get to win despite their bad behavior. And often, the writers (and even other people involved like actors) will openly express their contempt for their audience when speaking about the work. They pick fights where none needed to happen, saying stuff like "if you don't like this then I don't want you as an audience member anyway". They are, in short, bad writers who don't have the skill to successfully let their social views influence their work.

            The result of all this is that these writers don't succeed at persuading anyone. In years past writers could actually make progress on advancing the things they believed in because they had the wisdom to not openly preach to people and call them names. They respected people enough to let them draw their own conclusions, and as a result were successful. But writers today aren't good enough to persuade people to continue breathing, let alone something more controversial than that.

            There is also an uptick in how much politics get forced into art, with people trying to claim "everything is political" and the like. But that isn't nearly as big a factor as how bad today's artists are at using political themes in their work.

        • BeFlatXIII3d

          > Even more glaring is TV shows, where you now get an 8-episode 'season' every 2-3 years rather than the old days of 20+ episode seasons every year, often non-stop for 5 or more years.

          That's often a good change. Less filler for the sake of having another full season.

      • pimlottc4d

        In many cases, quality is being driven down by automation that’s drastically cheaper and produces results that are deemed “good enough”.

        Some of this is inevitable as new products and services move from being high end to mass-market, and it's perhaps a bit chicken-and-egg to determine whether we accept this because most people never really cared about quality that much anyway or because we just learn to accept what we're given.

        But it feels like there could be a world where automation reduces costs while still maintaining a high level of quality, even if it's not quite as cheap as it is now.

        • baazaa4d

          I once found some old price catalogues (early 20th century) for shoes etc. and estimated that the items there are barely any cheaper today in real terms. Now obviously that's partly because we have cheaper substitutes today, so we've lost economies of scale when building things the old-fashioned way and the modern equivalent has to be made bespoke... but it's still pretty alarming given we should be ~10x richer.

          But consider an example which can't be blamed on that. My city (Melbourne) has a big century-old tram network. The network used to cover the city, now it covers only the inner city because it hasn't ever been expanded. We can't expand it because it's too expensive. Why could we afford to cover the whole city a century ago when we were 10x poorer? With increasing density it should be even more affordable to build mass-transit.

          Obviously people blame the latter example on declining state capacity, but I'm not sure state capacity is doing any worse than Google capacity or General Electric capacity.

          • typewithrhythm4d

            Melbourne specifically is cooked by the rate of growth, and declining tax revenue per capita.

            When we funded the majority of the big infrastructure pushes, our rate of growth was lower and GDP per capita (and revenue per capita) was exploding. This generally ended with the start of the big multicultural Australia policy in the late 60s.

            So in comparison, the amount of infrastructure we need to build is greater per capita, as it has to cover the future population predictions, and it needs to be done over fewer years as well.

            Then we can get into the migration policy that's causing a decline in gdppc.

            • jimnotgym4d

              Since the beginning of Australia as a colony it has been: more people=more labour=more production capacity=more wealth.

              So did we just run out of useful things to do with people? Or did we concentrate the wealth away from the masses and blame the same immigration that created Australia in the first place?

              • typewithrhythm4d

                No, we really did not have a history remotely like that. It's a bit alarming to see a historical fact stated so completely incorrectly.

                We had a hugely restrictive immigration policy, (have a look at the rate of growth over time) followed by multiple wars that meaningfully reduced the population... We were winning the Malthusian game, just by having lots of resources per person available.

                The policies you have probably heard called "white Australia" were more accurately understood as immigration restriction policy. If you read anything published at the time, there was only slightly less animosity toward white English migration than toward migration from the rest of the world. This was the era of communism and workers' rights, and the workers absolutely understood that their labour was being devalued.

          • ViewTrick10024d

            I would say it is because of the Baumol effect.

            Construction is not massively more efficient today compared to a century ago, while salaries have massively increased.

            https://en.wikipedia.org/wiki/Baumol_effect

          • mitthrowaway24d

            I often wonder the same thing. My conclusion has been that automobile infrastructure swallowed the budget.

            • baazaa4d

              Definitely this is what was happening mid-century (when indeed everyone else was ripping out their tram networks entirely).

              But I think if you look at modern light-rail projects there really has been such insane cost-inflation it wouldn't be worth covering the city with trams even with a much bigger budget. Also because such a large fraction of the price is admin etc., it creates a bias towards more expensive infra (heavy rail) because the paperwork overhead is similar either way so you get more bang for your buck.

          • andai4d

            What declined is will, which is a function of testosterone (down about 50% since 1970).

            • disgruntledphd23d

              > What declined is will, which is a function of testosterone

              That's a really interesting claim. Do you have any sources that explain this further?

              • andai2d

                Thanks. I spent a few minutes reflecting on this and googling papers.

                Turns out that "will" is a vague concept and doesn't have great neurological or animal models.

                However, we can use some reasonable proxies!

                I would argue that "libido" is the most obvious one. I recently heard a multimillionaire admit (with some embarrassment) that "we really do all this to get girls."

                ( I assume "libido is a function of testosterone" requires no citation ;)

                Testosterone directly affects dopamine levels, dopamine sensitivity, and willingness to engage in competitive behavior:

                https://www.edenclinic.co.uk/post/testosterone-and-the-brain

                Another factor is "goal-directed behavior", which is mediated indirectly by "increased sense of agency"

                > these results further imply that through an embodied SoA, testosterone can ultimately modulate higher-order experiences of social power and goal-directed behaviour.

                https://www.semanticscholar.org/paper/The-Effect-of-Testoste...

                At the societal level there is a fascinating (and deeply disturbing) book by J. D. Unwin, who surveyed dozens of societies and civilizations:

                >The book concluded with the theory that as societies develop, they become more sexually liberal, accelerating the social entropy of the society, thereby diminishing its "creative" and "expansive" energy.

                https://en.wikipedia.org/wiki/Sex_and_Culture

                Notably, conscientiousness and executive function are not enhanced by testosterone. However, deficiency is associated with fatigue, depression, brain fog etc. So it supports "will" by supporting overall health, and a population-wide ~50% decline does not sound healthy to me.

      • plondon5144d

        In Japan right now and I see a ton of automation everywhere, self checkout at grocery stores and restaurants, but what you also see is a live human being assigned to the machines to help you if you have issues.

        • throwaway1504d

          > In Japan right now and I see a ton of automation everywhere, self checkout at grocery stores and restaurants, but what you also see is a live human being assigned to the machines to help you if you have issues.

          Isn't that how self checkout happens in every part of the world that has self checkout? I'm failing to see what's special about self checkout in Japan.

          • klausa4d

            Anecdotally, the ratio of machines to staff is much lower here than I've seen in Europe or the US, and they also just pay more attention, and are more proactive (you'll get staff at Muji making sure that you're aware that you can't check out tax-free).

            The staff will also instantly materialize if they are needed to confirm you can buy alcohol, or there is some kind of problem; which is also not my experience elsewhere.

            It's not a worldshattering difference, but it is noticeable.

            • trollbridge3d

              In US self checkouts the grumpy staff member is mostly there to stop people from stealing and acts annoyed when you try to get them to help you when the machine inevitably doesn’t work.

            • plondon5144d

              Exactly. In my local pharmacy in NYC there is no attendant specifically for the self-checkouts; it's usually someone floating around or the same person doing manual checkout. In Japan the person is very present, willing to help, and says thank you very much, which also prompts me to say thank you in return.

              • galaxyLogic4d

                But at Trader Joe's employees are curiously happy, and it seems genuine. I wonder why that is? At some other grocery stores nearby, employees are grumpy. Why?

                • Workaccount23d

                  Trader Joe's has a narrow selection, leverages economies of scale for relabeling, does its own distribution, and forgoes any form of ready-made/ready-prepared foods (instead leaning on a large frozen meal selection).

                  I love Trader Joe's, don't get me wrong, but I wouldn't be happy if it was the only grocery store I had access to. For me it's an awesome second-in-line grocery store, more like a specialty grocery than a main grocery.

                • lotsofpulp4d

                  My theory is that mid-scale full-service grocery stores with huge footprints and multiple specialized departments are on the way out.

                  They are stuck with expensive legacy union employee contracts, while smaller and more efficient operations like TJs and Aldi and Lidl and Costco and Walmart and Winco eat all the consistent low margin sales, leaving the big grocery stores with only volatile high margin sales.

                  People per household has been trending down for a long time, which also impacts the amount/variety of cooking.

                  • dragonwriter4d

                    > People per household has been trending down for a long time

                    It was, but 2021-2024 was the first three year period that didn't end lower than it started since the 1960s (starting and ending at 2.51 average per household); it is possible that trend has arrested.

          • OJFord4d

            Ah, how much you (and we in the west) have to learn! This is the ancient Japanese art of serufuchekkuauto de no ningen no sapōto.

        • nicbou4d

          Japan has a LOT of make-work jobs for the elderly.

          • yoyohello134d

            Isn’t that good?

            • boringg4d

              absolutely - far better than any other model I've seen.

              • jimnotgym4d

                Far better than UBI?

                • boringg3d

                  Definitely. UBI doesn't give you any connection to the world or meaning, just money.

                • johnnyanmac3d

                  Maybe GP hasn't seen UBI at scale. I sure haven't. Maybe one day.

      • giantg24d

        To be fair, the replacement projects we've outsourced to multiple Indian companies have been utter failures too.

      • carlmr4d

        >But if you look at replacement projects so many of them are such dismal failures

        The problem with replacement projects is when and why they're usually started. They're usually started once there's a fixed deadline on some technology ceasing to exist, creating the appropriate urgency.

        Usually the people that wrote that original software have long gone, the last few people that were able to maintain it are also nearing retirement age or already gone as well, you have some ancient technologies used for which it's hard to get documentation on the internet today.

        Now you're tasked with writing a replacement, and everything that doesn't work on day 1 is deemed a failure. It might have worked if you started earlier. Because if your original codebase is COBOL and assembly written for mainframe, it's really hard to find anyone that can understand what it does fully and rewrite it now cleanly.

        If you had updated from COBOL and mainframe assembly to C, from C to 90s Java, and from 90s Java to modern Java/Go/Rust/Node, you'd have plenty of institutional knowledge available at each step, and you would have people who know both the old and the new world at each step. Jumping half a century in computing technology is harder than doing small jumps every 10-15 years.

      • chii3d

        > Go watch CGI in a movie theatre and it's worse than 20 years ago, go home to play video games and the new releases are all remasters of 20 year old games because no-one knows how to do anything any more.

        None of the things you said are actually true. They only seem true superficially, because you've only seen the mass market crap.

        Good movies are still around, and you don't even notice the CGI, because it's cleverly done. For crap like the recently released Snow White, it's obvious that the CGI is badly done - but that doesn't make it an indictment of all movies released of late!

        Same with games - just because there's lots of AAA studio flops that look terrible, doesn't mean the medium is all terrible. There's so many good indie games that you can never truly play them all.

        But if your exposure to these products are only the mass market crap, then you might certainly feel that way.

        • snozolli3d

          > Good movies are still around

          Compare 1997 to today.

          Major hit after major hit was being released that year, and they were overwhelmingly original and creative. There had been a boom in independent filmmaking and many of the big production houses had started up smaller studios to attract the talent. Unfortunately, Hollywood did what Hollywood does and killed everything that made them good.

          Nowadays, we have endless releases of super hero sequels that are, fundamentally, the same movie over and over. We have endless remakes and reboots because nobody wants to take a chance.

          Yes, you can find creativity if you look hard enough, but in 1997 it was everywhere, and in your face. You can't pretend that it doesn't matter or that it doesn't mark an enormous shift in culture (business and society).

          • pcthrowaway3d

            Yeah, but compare television of today to that of 1997. I think the real difference is that priorities shifted in the industry with streaming. When a television show was serialized weekly at a regular slot, multi-episode narratives were more difficult to follow (viewers were out of luck if they missed an episode, since they couldn't just stream it later).

            So there has been something of a renaissance with television, starting around the time of the Sopranos release in 1999 I think, once there was a market for shows which didn't 'reset' between episodes.

            • bitwize3d

              Television today consists of a few bangers like Severance mixed with an awful lot of slop. Serialized or not, it's telling that the term Netflix coined for most of its output is "second-screen content", meaning stuff you have on in the background while you scroll your phone. I had been rewatching Max Headroom recently, and the difference it makes when you make a television show to be watched -- as most television was in the 80s and 90s -- vs. simply filling time, really struck me.

            • snozolli2d

              > compare television of today to that of 1997.

              Uh. 1997 had Oz, Buffy the Vampire Slayer, Stargate SG-1, King of the Hill, Just Shoot Me, Ally McBeal, The X-Files, Friends, 3rd Rock from the Sun, and MTV still showed music videos. Cable television hadn't yet been completely overrun with 'reality' television. We joked about The History Channel becoming the WWII channel, but it hadn't yet become the Ancient Aliens, cheap, pseudo-reality parody of itself.

              I get your point about serialized stories, but I'd still take the great entertainment of the 90s over today's over-reliance on digital effects and low-quality writing to generate cheap drama. Besides, most shows aren't written with a set arc, they just keep writing more so long as the numbers stay up. So we get a couple of seasons of increasing drama and mystery, then it gets cancelled with no payoff. I'd rather have the amnesia-based reset system than that!

          • bitwize3d

            What's more, the biggest hit of 1997 swept the Oscars the following year. I'm not the biggest Titanic fan in the world, but it gave moviegoing masses what they craved: spectacle and pathos. And it was considered the best film of the year just for doing that with aplomb. It was very Old Hollywood in a sense. These days, "blockbusters" are thought of as just expensive-to-produce slop for the lumpenproletariat, so the studios treat them as such. No one wants to produce a blockbuster with the sort of care and attention to detail that went into, say, Star Wars Episode IV: A New Hope. Or even Titanic.

            What's worse is that comedy is a minefield, as somebody somewhere is bound to be offended and launch a cancel campaign. So comedy films, including the once-beloved rom com, just don't get produced anymore like they used to. Any attempts at humor in movies have to be rolled into something else -- superheroes talking in aggressively annoying Whedonese and the like -- and housewives must content themselves with Hallmark Channel glurge. And what humor is there is cringey as fuck because it's either entirely toothless or it's a "Straight white men, am I right?" type of thing because you are still allowed -- and encouraged -- to mock that group.

            I mean, the normally sequel-averse Jim Carrey came back to do three movies about a video game hedgehog because those are the only movies being made in which he gets to flat-out do Jim Carrey stuff.

      • corimaith3d

        At least in terms of media, profits are higher (due to increasing mainstream acceptance rather than opinionated subcultures), and budgets are also higher (hence more risk), so why put in the effort? It's not just the West; the same thing is happening in Japan, even in places like mass-produced literature and webtoons in China and Korea. Gacha games/live-service games are unfortunately a lot more profitable for less effort than an ambitious single-player game like BG3. Then there's also the poignant quote by the Square Enix CEO that any sort of investment into media needs to be compared with the opportunity cost of just investing in the S&P 500 instead. It's not enough to just be profitable; you need to make at least 2x/3x over that 5-year dev time to break even.

        So a lot, unfortunately, is a choice that consumers have made. Even in terms of media again, a lot of modern viewers watch media more as self-insert fantasies, so quality of writing or novelty is often going to be worthless or even detrimental to them. I don't share that mindset, but having talked to many on /a/, /v/ or reddit, there are many who are just there to consume rather than out of actual interest.

        • johnnyanmac3d

          It's hard comparing GaaS to single-player games. It's not less effort, it's different effort.

          Your average gacha may look lower effort, but it has to sustain that effort for longer instead of patching the game for a few months and moving on. It has to do a lot more marketing to get players in, because many are this pseudo-MMO experience, complete with PvP and guild content to manage.

          At the highest end, Hoyoverse's operating costs would even make Activision blush. But those games make billions to compensate.

          • corimaith3d

            Well, in the same vein, we can directly compare the decline of fully fledged MMORPGs to the pseudo-MMOs that are gacha games, with essentially all the ambitious parts of the MMO stripped out.

            That very much is an indictment of ambition and progress here.

      • SkipperCat4d

        I feel that the West is backsliding because for the past decade, we've addicted ourselves to social media dopamine hits and we stopped observing the outside world because we've glued our attention to our phones. Seems like this has hit the under 30 group the hardest.

        I remember being bored and having to create my own fun. I remember being aware of my surroundings and being curious about them because I didn't have my favorite entertainment media attached to my palm. I remember learning about things such as what was in my Cheerios because the box was the only thing in front of me when I ate my breakfast.

        It would be a joke to say that AI exists to fill the void from what I mentioned above, but it does kinda sorta feel correct in a weird sci-fi conspiracy way.

        • gtirloni4d

          Good thing there aren't phones and social media in the East (or whatever you want to call non-West these days).

        • meander_water4d

          This is largely the conclusion drawn in Stolen Focus by Johann Hari (well worth the read). Although he argues not just in the West, but across the world.

        • fijiaarone4d

          People used to get dopamine hits from writing code that works, fixing cars, climbing mountains, playing music, and asking other people out on dates.

          Dopamine addiction isn’t the problem.

          • jfengel4d

            We've had TV for decades. Postman wrote "Amusing Ourselves to Death" in the 80s.

            We've always been addicted to dopamine and we were always getting it in the easiest way possible.

            • kevin_thibedeau3d

              TV wasn't as engrossing when there were only three channels. Fewer people would be full time couch potatoes. Cable's growth in the 80s changed how society allocated their leisure time on passive activities.

          • SkipperCat4d

            True, but what you mentioned involves active participation, creation and social interaction. Sitting around scrolling on TikTok feels more like empty calories.

          • jobs_throwaway3d

            > People used to get dopamine hits from writing code that works, fixing cars, climbing mountains, playing music, and asking other people out on dates.

            Those all sound vastly more positive both on a personal and societal level vs getting dopamine from your phone

      • antifa3d

        "Don't fix what isn't broken" is often becoming a survival tactic these days. You never know what new modern puchase you could make is spying on you, only works online (when no useful reason for that), made of cheaper materials, planning to remotely disable a feature and charge you for it, vendor locks resupply/maintenance, etc..

        It increasingly applies to nearly all aspects of the economy. Everybody wants to lock you in and take a cut. Almost all new innovation these days is just rent seeking gatekeeping. Even genuine innovations are unable to get their innovations out without either recreating entire software stacks (or supply chains) that's under feudalistic/parasitic control, they often remain niche and undermonetized. This will have an effect on the economy like a % yearly reduction in atmospheric oxygen will destroy biodiversity.

      • jose-incandenza3d

        I believe it's not related to capability, but to how investment works. Aggregate demand is composed of consumption and investment (I'm referring to the global economy, combining both public and private expenditure). Investment is the money extracted by capitalist actors in the system that is reinvested to generate profits. These capitalist actors need an incentive (the promise of generating more money) to invest the money they have extracted, and this incentive is usually the latest hot technology.

        For example, when the internet emerged, everyone wanted to be online; when smartphones appeared, everyone wanted to have an app; and when VR emerged, Facebook changed its name and lost half of its value in the stock market. Now, it's AI. Capitalists do not focus on the details of the technologies; to them, every new technology looks the same. They see new tech as a growing opportunity and old tech as a saturated market. Obviously, this perspective is flawed, but it doesn’t matter.

        In my opinion, AI is not going to create more value. The only real impact it will have is reducing the amount of workforce needed to generate that value, which will ultimately push the economy into a recession. As consumption declines, I don’t see what new technology could come after AI to offset this effect through further investment.

      • bloqs4d

        What's happening with games and movies is that investors have figured out through the decades that taking big risks often does not pay off, and that 25 to 45 year olds are the biggest consumer group.

        With broad changes in the distribution of wealth, and government spending on education sharply declining, levels of critical thinking and open-mindedness have declined.

        So now, if something can be made that's part of an existing franchise or a consumer-favoured product, then that's lower risk. It attracts more capital. Full-on remakes again and again, with people generally accepting bad games on nostalgia value, mean that sales even of a bad game remain palatable.

        I don't think the West is going backwards in capability, but people seem incapable of highlighting what has changed.

      • anonzzzies3d

        I think in the West all C-level executives/managers (it happens at all levels, including workers, they just tend not to get very far) just want to get rich (and/or gain power, same thing) and have zero pride or vision: they just pick the path that pays them the most; if that's good for the company/country/etc., that's a nice coincidence, but if not, fine. There are enough examples around, and they are not even ashamed that it costs lives; as long as they get the money, it's fine. There is no vision, no real plan beyond however long they think they'll stay on.

      • bee_rider4d

        I don’t think it is a good standard for judging a civilization, really. But CGI, 20 years ago? A lot of it was really quite bad. CGI has always had bad and good instances because of the interplay between increasing technical skill and totally random director-determined skill at selecting shots.

        I mean, like, Disney has been getting worse at CGI, but only because the whole company has given up. This is just normal companies shifting around, though.

        • tbyehl4d

          Anything from 20+ years ago that someone still thinks holds up as great CGI probably wasn't CGI in the first place.

          The problem with CGI today is that it's over-used and mis-applied in areas that still have Uncanny Valley type issues (fight scenes, car chases/crashes, etc).

        • johnnyanmac3d

          >I mean, like, Disney has been getting worse at CGI, but only because then whole company has given up.

          I think that's the main point, yes. There's a sense that before, companies were trying to push the envelope. These days it's just a shrug and cynical min-maxing of funds to the shareholders. CGI 20 years ago was objectively worse, but you can tell they had ways to hide the flaws or redirect the eye away from them. Now... Ehh, who cares? Just get the first pass through.

          If you want a relevant example: some people say Lilo and Stitch's live action has a weird looking Stitch model. Part of that is because way back in 2005, the original Stitch was simply never meant to be looked at in a side profile for an extended time. Art directors made sure to avoid that angle in every frame they drew. 20 years later... meh. Ship it. Screw the outsourced CGI trying to model something better, the cinematography being careful of angles, or any reaction from "nitpickers". We got the IP, it'll make money.

          It's not a franchise killer, but it's just one example of the many broken windows.

          • bee_rider3d

            I don’t completely disagree, but I think it is, at least, going to require a lot of work to generalize from “American large corporations” to “the West” (which has always been a fuzzy concept, but at least includes Europe, which seems to be getting better over time).

        • 0x1ceb00da4d

          CGI today is so good that you don't even notice it most of the time.

        • jfengel4d

          The Pixar division, at least, is extraordinary and continues to push for more.

      • KaiserPro4d

        > Go watch CGI in a movie theatre and it's worse than 20 years

        Sorry, but that's just not true. Sure there are shit VFX films, but I guarantee that the "serious" movies that people hold up as "all in-camera effects" have hundreds of shots with digital set extensions and all sorts of VFX magic.

        If you look at TV, where there has been huge competition, the use of VFX has exploded, mainly as a cost saving, but also as a story enhancer. Stuff that would have cost £20m ten years ago is being done for £500k. That's huge innovation.

        > remasters of 20 year old games because no-one knows how to do anything any more

        They are remasters because the people putting the money up are conservative.

        Innovation is happening, just not where you expect. Look at the indie games market.

        Much as I don't like it, a huge amount of innovation is happening in the world of YouTube and TikTok. New editing styles; almost a completely new genre of moving picture has emerged.

        Where there is competition, there is innovation.

      • mjevans4d

        I suspect a key to success is:

        Don't replace an existing solution with exactly the same thing on a different platform.

        Think larger. Solve today's / (near) tomorrow's problems BETTER. That's probably going to require changes to process too. A full evaluation of what's the most effective way with the capabilities and needs that exist now.

        Then bring up interfaces that provide what the old system did, verify the data round trips, and when it's approved cut over.
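
        For what it's worth, a minimal sketch of that last step (hypothetical names, Python): run the replacement in shadow behind the same interface, keep serving the legacy answer, and log every divergence until the data provably round-trips and the cut-over is approved.

            import logging

            logging.basicConfig(level=logging.INFO)
            log = logging.getLogger("parallel-run")

            def legacy_handler(request: dict) -> dict:
                # Stand-in for the existing system's interface.
                return {"total": sum(request["items"])}

            def replacement_handler(request: dict) -> dict:
                # Stand-in for the new implementation behind the same interface.
                return {"total": sum(request["items"])}

            def handle(request: dict) -> dict:
                old = legacy_handler(request)
                try:
                    new = replacement_handler(request)
                    if new != old:
                        log.warning("divergence for %r: legacy=%r new=%r", request, old, new)
                except Exception:
                    log.exception("replacement failed for %r", request)
                # Keep serving the legacy result until the divergence rate is
                # acceptable, then flip the cut-over.
                return old

            print(handle({"items": [1, 2, 3]}))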

      • hackernoops4d

        Decline was a set of choices, it's just that the average person didn't make those choices and we're not allowed to discuss them or the people that made them.

        • galaxyLogic4d

          Someone has to shout out: the emperor has no clothes!

          In recent years lying has been normalized. Black is White etc. 1984 is here.

      • exe343d

        > if you look at replacement projects so many of them are such dismal failures that's management's reluctance to engage in fixing stuff is understandable

        This makes it sound as if management only decide whether to engage in modernizing or not. I think it's only fair to also give them full credit for the failures - profit over people, dogma over pragmatism, etc is their fault.

      • robertlagrant3d

        > go home to play video games and the new releases are all remasters of 20 year old games because no-one knows how to do anything any more

        People do, but they aren't in AAA studios. They're doing indie games, because their large corporations were captured by the professional glom-onto-success management class.

      • dgfitz4d

        > releases are all remasters of 20 year old games

        This just isn’t true. It’s not nice to make things up.

        • bigstrat20033d

          It's far less nice to accuse people of lying without any evidence. If someone is wrong, say so. But there's no reason to call people liars.

          • dgfitz3d

            Elden Ring by _itself_ destroys the whole point.

            If someone is wrong, say so? I said "this just isn't true"

            That doesn't count?

        • theshackleford4d

          Yup. It’s just nostalgia huffing and increasing age.

      • whoknowsidont4d

        >But if you look at replacement projects so many of them are such dismal failures that's management's reluctance to engage in fixing stuff is understandable.

        This is most definitely still the fault of management.

      • nindalf3d

        > go home to play video games and the new releases are all remasters of 20 year old games

        This annoyed me, because it's so manifestly untrue. The games of the year of the last few years (https://en.wikipedia.org/wiki/List_of_Game_of_the_Year_award...)

        - 2024: Astro Bot

        - 2023: Baldur's Gate 3

        - 2022: Elden Ring

        - 2021: No consensus pick, but It Takes Two stands out to me

        - 2020: Hades

        All of these, with the exception of BG3 are original IP. A lot of them have really unique game mechanics that I haven't seen before. Hades has some of the tightest combat that never gets old even after hundreds of runs. It also has extraordinary music and voice acting. Truly a labour of love.

        It Takes Two is a co-op story adventure. Every single level has a new fun mechanic. In one of them you literally control time. Please, do tell me which game from 20 years ago was a co-op adventure where every level was unique? The best co-op was probably Halo 2 (2004), but that's just shooting from beginning to end.

        You're thinking "well, ok there's one sequel in there. That's proof that video game companies want to play it safe". But you'd still be incorrect. BG3 is inspired by its prequels BG1 and 2, but those released 20 years ago. Open YouTube and check out how different they are in every single way. I'll bet there isn't even a single line of code common between the BG3 and the originals. BG3 exists because the developers grew up playing BG1 and 2 and wanted to make a homage to the games that shaped them. And they succeeded, good for them.

        I will admit that I didn't play Elden Ring. I didn't even attempt to, because I already have a full time job. But that's great too, because it shows that there are games being made for people who love a punishingly difficult challenge. That's not me, but you can find that now if you want.

        Your comment is just rose-tinted whingeing. It's so easy to write a comment like "man, the good old days were really good weren't they". But ... no. I can play all of the games from the good old days and I can also play Hades, It Takes Two and BG3. And that's just the surface! There are so many incredible games being made and released. Factorio is great in many ways, but the most remarkable part is how they've optimised their game to a mind-boggling extent.

        No one knows how to do anything anymore? Then how did these incredibly innovative, flawlessly executed games get made?

        • ViktorRay3d

          I posted a reply to another user here.

          https://news.ycombinator.com/item?id=43493740

          I don’t want to retype everything I posted in that reply but it kind of applies to your comment as well.

          In 2015, if we were having this discussion, I could easily pull out dozens of groundbreaking, innovative games from 2010 to 2015.

          In 2005, if we were having this discussion, I could have easily pulled out dozens of groundbreaking, innovative games from 2000 to 2005.

          But we are having this discussion in 2025, and I know both you and I would struggle to pull out a dozen high-quality new innovative games that have come out in the past 5 years.

          Clearly things have gotten worse.

          • nindalf3d

            It's unclear what you want, exactly. Look at the list of Games of the Year I shared. There are dozens of incredible games. Have you played the games I mentioned - BG3, Hades and It Takes Two? Hit games can be similar to previous hits. That's not a bad thing though. That's just successful innovations spreading.

            What I suspect is the problem is that you also want them to be groundbreaking and innovative. This is an impossibly high bar to meet in a mature industry. There are some games that still meet this bar. Half-Life: Alyx is from 2020; ever played anything like it? Have you truly built all the possible contraptions in Tears of the Kingdom (2023)? The Last of Us Part II (2020) is going to premiere on TV in a couple of weeks. How many older video games have a story that was shot so perfectly that it could be translated shot for shot into a hit TV show or movie?

            Check out these two videos of a guy horsing around in Tears of the Kingdom - https://www.youtube.com/watch?v=lpFXlkjAurc and https://www.youtube.com/watch?v=VyQdn5bwF_Q. Look at how much fun he's having! Yes, it takes an incredible game to enable such creativity. But he's having fun because he wants to have fun.

            If you're finding less joy in games than you used to, you should be open to the idea that it's not the games that are causing that effect.

          • Workaccount23d

            I don't know about you, but as I have grown older video games in general have just become less appealing. However I can see the same glee in younger coworkers eyes when talking about games today.

            In 2005 I could play a game for 12 hours straight and then hardly be able to sleep I would be so excited about playing it the next day.

            Today, even for a game like BG3 that is objectively an incredible game, I can do maybe 2 hours every few days and feel fulfilled.

            I don't think this an outlying example either. Most of my friends are now the same way, and frankly when you login to play games online, it's not exactly overflowing with the 35-40yr olds who saturated servers 20 years ago.

            • tarentel3d

              I think that's just life. I felt the same way in 2005 as you did. By 2015 when this person is talking about I had other life priorities and stopped playing games as much. I still play games from time to time and even get addicted here and there but, at least for me, I didn't stop playing because the quality of games went down. I can't really say they've gone up either though. If anything people have made about the same amount of great games, and bad ones, each year for a while now.

        • nottorp3d

          > All of these, with the exception of BG3 are original IP.

          Elden Ring is just Demon's Souls 4 from 2009. It's good to the point that I'll still preorder its successor, but nothing is original there any more.

          Edit: not 4, more like 7?

          Edit 2: Hades seemed more difficult to me than Elden Ring. Maybe you shouldn't trust the marketing and check for yourself.

          • nindalf3d

            Nah, I'm good. I've seen people play Elden Ring and it's definitely not for me. I don't even like the art style.

      • Telemakhos3d

        > Go watch CGI in a movie theatre and it's worse than 20 years ago, go home to play video games and the new releases are all remasters of 20 year old games because no-one knows how to do anything any more. And these are industries

        Maybe the arts shouldn't have been industries. Look at sculpture or painting from the Renaissance and then postmodern sculpture and painting and you'll see a similar decline, despite the improvement of tools. We still have those techniques, and occasionally someone will produce a beautiful work as satire. We could be CNC-milling stone buildings more beautiful and detailed than any palace or cathedral, ones that would last for generations, but brutalism killed the desire to do so, despite the technology and skill being available. There's something about industrialized/democratized art being sold to the masses that leads to a decline in quality, and it's not "because no-one knows how to do anything any more." It's because no one cares or wants to pay for anything beautiful when there are cheaper yet sufficient alternatives.

      • WillieCubed4d

        I don't think the issue is that the West is going backwards in capability; rather, it's that although it has the capability to produce great products (software, media, etc.), it deliberately chooses not to because it's not as cost effective, because the people with expertise are overworked and understaffed, or because management had other priorities (see AAA game development).

        In other words, the capitalists won.

        • baazaa4d

          AAA games are eye-wateringly expensive though; management aren't imagining it. My point is that things becoming more expensive is a symptom of decline. I'm sure the late Romans consoled themselves that they could still build another Pantheon, they just cared more about efficiency now.

          Where I work in government we've stopped paying for important data from vendors (think sensors around traffic etc.) because the quotes are eye-wateringly expensive. But I've worked in data long enough to know the quotes probably reflect genuine costs, because data engineers are so incompetent (and if it's a form of price gouging it's not working, because the government isn't paying up). So it looks like we're choosing to be in the dark about important data, but it's not entirely a choice.

          Saying we can do stuff but it's unaffordable is imo just another way of saying we can't do stuff.

          • johnnyanmac3d

            They are, but they don't have to be. This is definitely an example of the West faltering where the East is really flourishing. They aren't trying to make the next Fortnite nor GTA 6 or whatever billion dollar day one hit. They pick a more modest scope and budget, reuse assets smartly, and get reliable releases out.

        • QuadmasterXLII4d

          That was the case even six or seven years ago, but the warning signs are there that we've chosen not to for so long that the abilities have rotted.

        • fijiaarone4d

          Who did the capitalists beat? America wasn't a socialist utopia 20 years ago, or even 50 years ago.

      • throwaway67344d

        The result of boomer cultural and capital domination. Millennials need to grow up and yank power from them, starting with the massive government handouts given to the old.

        • nradov4d

          Should I take it that you support privatizing Social Security and eliminating Medicare? That probably won't be a winning political platform with any generation.

          • throwaway67344d

            As a first step I'd prefer to see both of them dramatically more means tested. There's no reason that someone with millions of dollars of assets should be receiving thousands of dollars a month from the state. I think this position is politically tenable with the looming cuts that will happen automatically over the next decade

            • trollbridge3d

              Means testing SS would result in elderly people transferring their assets to other people, much as already happens so they can get free nursing home care. Medicare isn't means tested but Medicaid is, so we can see exactly what happens.

              Means testing will also make paying into these programs even less popular. Upper middle class people will ask why, exactly, they’re expected to pay more into a retirement program they will get less out of. That’s a recipe for political change from a party who promises not to do that.

            • nradov3d

              That's not going to happen so you can forget about it. Means testing entitlement programs has been tried before and never got much political support from any generation or political party. It's unwise to create incentives for old people to be poor because then workers will be less likely to also save for retirement in their own separate private accounts.

              • johnnyanmac3d

                How is that an incentive? It's a fallback. You're not gonna live comfortably off SS payments alone in this day and age.

                Treating this mentality of "taking money out that you put in" as "taking handouts" is the exact reductive mentality being used to try and have the government steal the money you earned from under your nose.

        • hackernoops4d

          Even worse is the entitlement spending imbalance along racial lines.

          • somename94d

            Yes, only Whites and Asians are net taxpayers over the course of their lifetimes. All others, on average, receive more government benefits than they pay in taxes.

            • lotsofpulp4d

              Land owners and old people are the single biggest recipients of government benefits. Specifically, Social Security and Medicare, and the legal/police apparatus for ensuring protection of property.

              Also, traveling and routing utilities and police and ambulances and all of society around a larger plot of land costs (in time and energy and materials) at least a power of 2 more than a smaller plot of land.

              Not only are there no marginal tax rates for land value tax, but there are tax breaks for the elderly, caps on tax increases the longer the land is owned, and tax deferrals (such as the 1031 exchange).

              I will leave it to the reader to figure out which “tribe” is most represented amongst land owners. And old people receiving Social Security and Medicare.

              • ogogmad3d

                Your use of the word "tribe" suggests you mean Jewish people, whether or not you realise that.

                [EDIT] Understood, thanks. It's more like some people could mistake the intention.

                • lotsofpulp3d

                  I find tribe to be a more accurate word than “race”, which is some ever changing confluence of skin tone, ancestry, and other difficult to pin down characteristics.

      • kaycey20224d

        The West isn't capitalist any more. Sure, the number goes up, but at what cost?

        • sloowm3d

          This is the absolute opposite take of what is actually happening. We are in the most capitalist time ever. A few people own all the wealth, and 2 of them are also in control of the most powerful government. This is capitalism, baby; enjoy the full wrath.

      • trashtester3d

        With all due respect, this attitude typically comes with age. I see it in myself, too (I'm over 50).

        You're right that an important reason it's hard to replace those 30+ year old systems is that the current devs are not necessarily at the same level as those who built the originals. But at least in part, this is due to survivorship bias.

        Plenty of the systems that were built 30-50 years ago HAVE been shut down, and those that were not tend to be the most useful ones.

        A more important tell, though, is that you see traditional IT systems as the measuring stick for progress. If you do a review of history, you'll see that what is seen as the measuring stick changes over time.

        For instance, in the 50's and 60's, the speed of cars and airplanes was a key measuring stick. Today, we don't even HAVE planes in operation that match the SR-71 or Concorde, and car improvements are more incremental and practical than spectacular.

        In the 70s and into the 80s, space exploration and flying cars had the role. We still don't have flying cars, and very little happened in space from 1985 until Elon (who grew up in that era) resumed it, based on his dream of going to Mars.

        In the 90s, as Gen-X'ers (who had grown up with C64s and Amigas) came of age, computers (PCs) were all the rage. But over the last 20 years little has happened with the hardware (and traditional software) except that the number of cores per socket has been going up.

        In the 2000s, mobile phones were the New Thing, alongside apps like social media, uber, etc. Since 2015, that has been pretty slow, too, though.

        Every generation tends to devalue the breakthroughs that came after they turned 30.

        Boomers were not impressed by computers. Many loved their cars, but remained nostalgic about the old ones.

        X-ers would often stay with PCs as the millennials switched to phones-only. Some X-ers may still be a bit disappointed that there are no flying cars, no Moon Base, and no Mars Colony yet (though Elon, an X'er, is working on those).

        And now, some Millennials do not seem to realize that we're in the middle of the greatest revolution in human history (or pre-history, for that matter).

        And developers (both X'ers and millennials) in particular seem to resist it more than most. They want to keep their dependable von Neumann architecture computing paradigm. The skills they have been building up over their career. The source of their pride and their dignity.

        They don't WANT AI to be the next paradigm. Instead, they want THEIR paradigm to improve even further. They hold on to it as long as they can get away with it. They downplay how revolutionary it is.

        The fact, though, is that every kid today walks around with R2D2 and C3PO in their pockets. And production of physical robots has gone exponential, too. A few more years at this rate, and it will be everywhere.

        Walking around today, 2025 isn't all that different from 2015. But 2035 may well be as different from 2025 as 2025 is to 1925.

        And you say the West is declining?

        Well, for Europe (including Russia), this is true. Apart from DeepMind (London), very little happens in Europe now.

        Also, China is a competitor now. But so was the USSR a couple of generations ago, especially with Sputnik.

        The US is still in the leadership position, though, if only barely. China is catching up, but they're still behind in many areas.

        Just like with Sputnik, the US may need to pull itself together to maintain the lead.

        But if you think all development has ended, you're like a boomer in 2010 who, using planes and cars as the measuring stick, thinks nothing significant has happened since 1985.

      • anovikov3d

        Look at space and it was obviously broken 20 years ago in the West: the Shuttle had finally proved itself, after 20 years of trying to conceal it, to be a generational, unfixable mistake, and there was stubborn insistence on pushing through with a replacement based on the same technology and the same people, burning billions while staying stuck.

        Now it is in the best shape ever and progress seems to be unstoppable. And the West thoroughly dominates it in every dimension, and that dominance only seems to be accelerating.

        Boeing just failed in what was an inherently unfair game: they tried to compete with a state-funded Airbus that could burn unlimited cash without worrying about real profitability. Boeing tried to do it by cutting costs, and failed.

    • nickff4d

      De-scoping is also a commonly-cited result of higher interest rates. We recently lived through a prolonged episode of zero-interest-rate-policy (ZIRP), which encouraged long-term and risky projects. When interest rates go up, the minimum acceptable return-on-investment (ROI) required to lure investment money away from low-risk investments like government bonds also increases correspondingly.
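
      To make that hurdle-rate point concrete, here is a minimal Python sketch with made-up numbers (the project cash flows and the discount rates are illustrative assumptions, not figures from this thread): the same project clears the bar when safe money earns roughly 0%, but not once bonds yield 5%.

        # Minimal NPV sketch; every number here is an illustrative assumption.
        def npv(rate, cashflows):
            """Discount yearly cash flows (year 0 first) at the given rate."""
            return sum(cf / (1 + rate) ** year for year, cf in enumerate(cashflows))

        # A risky project: invest 100 now, earn 22 a year for five years.
        project = [-100, 22, 22, 22, 22, 22]

        for rate in (0.00, 0.02, 0.05, 0.08):
            print(f"discount rate {rate:.0%}: NPV = {npv(rate, project):+.1f}")

        # Approximate output: +10.0 at 0%, +3.7 at 2%, -4.8 at 5%, -12.2 at 8%.
        # Once government bonds pay 5% risk-free, this project no longer clears
        # the hurdle, which is how higher rates translate into de-scoping.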

      • echelon4d

        I'd be willing to bet that the biggest reason is that there hasn't been any antitrust action against the big tech companies. They just sit at the top, siphoning value from every other market in the world. If you need to use the internet in any way, FAANG taxes you.

        None of these big tech companies really need to grow bigger. The smartphone is essentially done. AWS just prints money. Social/consumer apps are "done". What more is there for them to do but collect rent?

        The US government needs to break them all up. That'll oxygenate the entire tech sector, unlock value for investors, and kickstart the playing field for startups.

        Google, Apple, Amazon, Meta, and maybe Microsoft. Break them up.

        • dnissley4d

          State the terms! Maybe come up with the US GDP % growth bump you would imagine we get for each of these companies being broken up?

        • UncleOxidant4d

          Break them up along functional lines or just break them into multiple smaller entities that are each still doing pretty much what the larger one was doing?

        • zombiwoof4d

          Maybe Microsoft

          • echelon4d

            Apart from their gaming department absorbing every big competitor, I can't think of ways Microsoft is abusing monopoly power like the rest of the mentioned companies.

            • autoexec4d

              How about turning Windows into an advertising and data collection platform? Sucking up vast amounts of people's personal data, even corporate data, to leverage against their customers so that MS can take in more profit.

              • globnomulous4d

                I'm much less concerned about this -- as far as I'm aware, the data collection is standard telemetry that any software provider would want, and the stuff related to advertising is, I think, largely concerned with app recommendations in the Microsoft store -- than I am about anti-competitive bundling practices.

                • autoexec4d

                  Microsoft may call what they collect telemetry, but it goes way beyond what I'd want or expect to be shared with a bunch of strangers at whatever company made the software I'm using. It'd take a whistleblower (or more likely multiple whistleblowers) for us to know everything MS is doing with that data, but they've been shoving ads all over the OS for a long time.

                  I'm struggling to think of a part of the OS that hasn't had ads shoved into it... the terminal I guess... There have been ads in the start menu, the lock screen, in pop up notices, in the file explorer, in search results, in the control panel, on the task bar, in the share pane, in windows update and in a bunch of windows apps like ink workspace. They've even just force-installed random programs to people's systems.

                  • globnomulous4d

                    That (edit: second paragraph) is fair and a good point. Maybe Android has desensitized me to advertising and crap-/bloatware.

                    I'd wager, too, that the addition of the garbage you're describing has coincided with the OS's worsening performance. File Explorer performance is so abysmal that it may as well be an Electron app.

                    On the other hand (edit: regarding your first paragraph), Microsoft seems very serious about not falling afoul of the law, probably because of the cost of the anti-trust litigation they faced in the 90s and 2000s(?). It wouldn't surprise me at all if there were nothing for a whistleblower to blow the whistle on.

            • fhd24d

              They're the OG among big tech when it comes to abusing monopoly power, e.g. during the first browser war.

              • fijiaarone4d

                After IBM & AT&T

              • 01HNNWZ0MV43FF4d

                They've been doing it so long that a generation of programmers grew up under moderate MS

            • RajT884d

              Hey, remember that time MSFT got hauled in front of Congress for shipping IE with Windows and making it the default?

              Yeah, they do way worse shenanigans now.

            • Teever4d

              Uh... aren't they able to do that by leveraging their monopoly in other industries?

              That's the definition of monopolistic behaviour that almost got them split up 25 years ago.

              • nradov4d

                Which other industries does Microsoft monopolize? They used to have a personal computer OS monopoly but there is at least one viable competitor, and personal computers have become less relevant as many tasks have shifted to mobile devices. My flagship smartphone cost more than a typical Windows PC.

                • Teever4d

                  They hold a dominant position in desktop operating systems, word processors, and spreadsheets.

                  • nradov4d

                    But not a monopoly in any of those categories. Google Docs and Sheets are kind of garbage compared to Microsoft Office if you need to do anything complex or work with large files, but for many users they're good enough.

                    • Teever3d

                      How is this different from 1998 when Microsoft was accused of being a monopolist?

                      Apple was selling a desktop operating system that was a competitor to Microsoft Windows, Corel WordPerfect was a thing and so was Lotus 1-2-3.

                      • nradov3d

                        It's different because in the previous antitrust case, Microsoft was accused of product tying. They apparently had private APIs built into their operating systems which only their own applications were allowed to use, and weren't available to competitors. There is no such allegation today. Any third-party vendor can write Windows applications that work just as well as Microsoft's own applications.

                        Apple also had a much smaller desktop OS market share at the time. They nearly disappeared but are in a much stronger position today, which makes it harder to argue that Microsoft has a monopoly. There's no strict threshold for market share in these cases, but it's one of the factors taken into account.

                        • Teever3d

                          Just because they aren't tying their products now doesn't mean that they aren't abusing their monopoly position and leveraging it to boost their standing in other, non-monopoly markets.

      • nicbou4d

        At the same time, ZIRP completely distorted the market and gave us multi-billion dollar companies that never had a workable business model, but filled offices all over the world with useless employees. I really enjoyed having a chief happiness officer, but I don't think the company needed it.

        • coffeebeqn3d

          It’s easy to have happy employees when you have infinite money and no pressure to ever build a sustainable product. Personal experience. Things do take a very sudden and powerful turn when the bubble bursts

        • nyarlathotep_3d

          Yep. This is part of the core of the unspoken reality in this article--there were/are way too many people, even in nominally "technical" roles, for a well-adjusted market. A large portion of this employment is a direct result of the "prosperity" of the ZIRP era, which has now ended.

      • Bukhmanizer4d

        Yeah I think there are a bunch of different reasons that corporations are doing less, interest rates are a major factor, but the point is that it isn’t exactly the “AI is taking our jobs” narrative that people want it to be.

        • walterbell4d

          Starting in 2022, US companies could not deduct SWE salary expenses in the same year, only over 5 years like hardware CapEx. For big companies, this will roll over in 2027. Meanwhile, LLM expenses can be written off immediately as OpEx.

          https://news.ycombinator.com/item?id=37494601
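
          As a rough illustration of why that bites, here is a Python sketch comparing the first-year deduction on $1M of engineer salaries under the old immediate-expensing rule versus 5-year (domestic) and 15-year (foreign) amortization. The salary figure is made up, and allowing only half a year of amortization in year one is an assumption on my part, so treat this as a sketch rather than tax guidance.

            # Sketch: first-year deduction on $1,000,000 of SWE salaries.
            # Assumes 5-year (domestic) / 15-year (foreign) amortization with
            # only half a year of amortization allowed in year one (an
            # assumption for illustration, not tax advice).
            SALARIES = 1_000_000

            def first_year_deduction(amortization_years, mid_year=True):
                annual = SALARIES / amortization_years
                return annual / 2 if mid_year else annual

            print(f"Old rule, expense immediately: ${SALARIES:,.0f}")                   # $1,000,000
            print(f"Post-2022 domestic, 5 years:   ${first_year_deduction(5):,.0f}")    # $100,000
            print(f"Post-2022 foreign, 15 years:   ${first_year_deduction(15):,.0f}")   # $33,333
            # A company that breaks even on cash suddenly shows a large taxable
            # "profit" in year one, while an LLM subscription stays ordinary OpEx.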

          • trollbridge3d

            This is probably the single dumbest aspect of US tax policy.

            If you hire Canadian software engineers, you can dodge this and deduct the expenses in your Canadian subsidiary. If you outsource software dev to another company you can usually get away with expensing it.

            • __turbobrew__3d

              That is a good point, my company recently expanded to Canada pretty aggressively and I wonder if this was a factor.

              Historically there has been a brain drain from Canada to the US, but if Canada can set up favourable policies for companies maybe they can start reversing that.

            • walterbell3d

              Research outside the US = 15 years amortization?

              https://www.grantthornton.com/insights/alerts/tax/2023/flash...

                The TCJA amended Section 174 by removing the option to expense SRE expenditures, instead requiring taxpayers to capitalize and amortize SRE expenditures over a period of five years (attributable to domestic research) or 15 years (attributable to foreign research)
              • trollbridge3d

                You create an “arm’s length” Canadian subsidiary that then licences the software to you.

                The Canadian government also heavily subsidises this. Smart of them to do so.

                • nickff3d

                  AFAIK the Canadian government’s ‘subsidy’ is to allow R&D expenses (but not investments) to be fully deducted in the year they are made. This seems analogous to the old rule in the USA.

                  • trollbridge1d

                    They do more than this. In Alberta we got a 50% credit on salaries paid if we hired people with oil/gas skills into a non oil/gas company. Think ML engineers who have been doing advanced data science for geologists to find oil coming over to a tech startup… and the government pays half their salary for you, which is already cheaper than America. And you get to fully deduct it.

                    And our government is busy prattling on about putting tariffs on Canadian maple syrup or something…

          • emgeee4d

            great point, thanks for sharing

        • UncleOxidant4d

          It seems like we're in kind of a lull that's being caused by expectations of what AI is about to be able to do. And these expectations could be completely wrong. But it's already causing changes in corporate planning and spending.

        • nickff4d

          Agreed! Uncertainty to do with short term economic issues, medium term geopolitical risk (Trump, China, Russia, etc.), and longer-term issues (national debt, Medicare, etc.) also shouldn’t be underestimated.

      • jayd163d

        It also feels like the investor class has lost the ability to make strong/informed bets after the decade of cheap slot pulls.

    • csomar4d

      I have second-hand experience with this. A friend tried to open a QuickBooks account. He created the account successfully and his card was charged, but then he couldn't log in because his account essentially doesn't exist. Contacting support was useless because they can't find his account either (though one agent was able to locate the charge and refund it).

      He tried again and hit the same issue. Now he is going for a chargeback. There is nobody at QuickBooks who can solve this problem, as most of the support (from India) seems to be there just to re-read scripts.

      But hey you should buy into the stock as they are going into an AI transition: https://www.tipranks.com/news/intuit-stock-nasdaqintu-layoff...

      • idrathernot3d

        Intuit is a perfect microcosm of everything wrong with the western status quo. Complete disregard for their customers, corporate focus on regulatory capture and other forms of rent seeking, product updates that reduce functionality from prior versions accompanied by price hikes.

    • prisenco4d

      The AI takeover of the startup space makes me feel a bit crazy because there are still thousands of world-changing app ideas that have zero to do with AI but nobody's funding or building them.

      We can't possibly have run out of consumer app ideas in a decade or two, right?

      • sho_hn4d

        Honestly, as a guy working in open source at the time, the massive brain drain when crypto came along and young people wanted to get rich instead of solving problems was incredibly dispiriting and frustrating. Seen in comparison, the AI stuff has a much bigger application space, is building tools, and is far less frustrating to me.

        • nicbou4d

          Not only get rich, but do so in the least productive, most destructive way imaginable.

          • boringg4d

            A totally vapid class of people working in that space, minus a few notables.

      • Macha4d

        Honestly even before it was AI, it felt like everything was blockchain or "Uber for X". The majority of startups have been trend chasing for a while.

        • baby_souffle4d

          To a broader extent, isn't this kind of the history of Silicon Valley ever since the dot-com bubble?

          For every small startup trying to build innovative robotics to solve a healthcare or agriculture problem, there's 10 startups getting 100x funding because they figured out how to put jpegs on the blockchain and the last guys that figured out how to do that had a nice exit...

          I forget what the economists call this, but it's the defining characteristic of the housing market. The value of your current house is determined by the last few local sales and little else. All of the startups using the currently in-vogue technology feel like that...

          • bradlys4d

            This is exactly how Silicon Valley works because this is how investors work. Investors aren’t really looking for true innovation - they’re looking for an idea that will let the next sucker take the hit. When one company is doing very well, clearly there’s some suckers out there buying the stock. Therefore, if we have a stock that is similar then maybe we can also get suckers to buy the stock.

            That’s all it is. That’s part of why Silicon Valley is so clout chasing and cargo culty. It’s entirely due to investor pressure to get immediate returns. Immediate returns mean you have to follow whatever the current hype is.

            Is it IoT, crypto, nft, Uber for X, self driving, etc. etc.? That’s what you do. You follow whatever the hype is and bail after you get your desired return.

            There’s no desire for a sustainable business. There’s a desire for other investors to be a sucker that holds the bag at the end.

            • maccard4d

              > You follow whatever the hype is and bail after you get your desired return.

              > There’s a desire for other investors to be a sucker that holds the bag at the end.

              I agree with everything you said except for these two sentences. If you take VC funding to build a sustainable business, frankly you're doomed from the get-go. That's not what a VC wants and it won't get you funded. There are other routes to that.

              What VCs want, if 99 of those startups fail, is to be holding the Airbnb, Uber, or Anthropic at the end. Because holding that from the beginning will make you more money than any other option.

              • bradlys4d

                The stock market is mostly based on the greater fool theory. You can’t really escape that.

                Most VCs doing 99 investments for the 1 big win are akin to YC. They're not doing a Series C for $100M and expecting 99 of those to fail. They're expecting to get a return back fairly soon.

                Holding the stock isn't helpful for a VC unless they're going to be using some financial mechanism for leverage, which means they've given it up to the other institution that now essentially owns it. A lot of these stocks aren't giving you meaningful dividends. You have to sell, or give up some form of control over them, to be a successful VC. How else would you continue to invest?

      • bamboozled4d

        I wonder...what sort of world changing apps do you think we're expecting to see?

        I think about it quite often, and there are a LOT of apps out there, and really, humans don't need that much to be happy.

        • Bukhmanizer4d

          > there are a LOT of apps out there, and really, humans don't need that much to be happy.

          There’s more music out there than there’s ever been. More tv shows and movies than I could possibly watch, but I still find new things to watch and listen to.

          But tech? Other than maybe my robot vacuum, I don't think there's anything in the last 5 years I've seen that I've felt is going to make my life easier or better. Which seems odd, because the pace at which technology is improving only seems to be accelerating. We can do more than we ever could before, but it feels like the appetite to improve things is no longer there.

          • YetAnotherNick4d

            It takes a lot more than 5 years for people to recognize the need and for the product to deliver. It took desktops, smartphones, and the internet decades to get out of the "I don't need it" zone. Now you need them. Even the robot vacuum has been around for decades.

          • bamboozled3d

            Driver assist technology has been a pretty amazing improvement, but it's also getting "old".

        • UncleOxidant4d

          And human attention is a limited resource that just might be tapped out at this point.

      • pram4d

        How quickly we forgot the “Web 3” revolution!

        • jongjong4d

          Well it was a scam. The ultimate scam. Literally, they only supported scams and suppressed projects with real potential. I say that as someone who worked in the space and saw it turn into a scam... Started back in 2017.

          • __MatrixMan__4d

            Ultimate, you say? I think the future for scams is looking bright. Scams will one day look back at web 3 and reminisce about their quaint beginnings.

            • jongjong2d

              Ah yes I exaggerated there. The global debt-fueled fiat monetary ponzi is the ultimate scam, crypto just plays a supporting role in giving people false hope that you can fix corruption with corruption.

          • kulahan2d

            So who is this specter that somehow and for some reason killed off all useful projects and made sure there was no way to even exploit it for unscrupulous profit?

      • morkalork4d

        It feels like we ran out in the first decade after the iPhone (2007-2017) and have been running on fumes since. Can you name a world-changing consumer app or company, outside of AI fluff, that came to be between 2017 and the end of 2022, when ChatGPT was released?

        • Spooky234d

          There’s a lot of B2B stuff.

          How much easier is it to manage and operate technology in 2025 than it was in 2005 or 2015? I have three core tech teams with 12-18 people. In 2005 I'd have needed 500+ to do what I do today, assuming the tech could even do it.

          Breakthrough B2C products don't appear annually. But everything is better. Apple Maps can estimate my travel time for a 300 mile drive to within 5 minutes. I bought a last minute flight to Rome last summer knowing nothing about Rome and speaking no Italian, and I did fine, thanks to the iPhone and the mobile app ecosystem.

        • pedalpete4d

          I think this is the cycle. Take a look at the S curve, things move in jumps with leveling out/adaption periods in between.

        • AbstractH244d

          Slack, Zoom, TikTok

          And those are just off the top of my head

          • robenkleene4d

            Those all predate 2017:

            - Slack: 2013

            - Zoom: 2011

            - TikTok: 2016

            Based on "Initial Release" on Wikipedia.

          • singpolyma34d

            Neither Slack nor Zoom did anything new or interesting. Just clones of stable tech that won on marketing.

            • baby_souffle4d

              Maybe they didn't have innovative functionality initially but the user experience was certainly better.

              Microsoft fumbled Skype, and nobody was putting together convenient APIs on top of IRC...

              • entropicdrifter4d

                Right, so iterations that ate up the old-school competition, not revolutions

                • brookst4d

                  What was the last major tech company that was a revolution and not mere iteration?

                  • samrus4d

                    Maybe Netflix?

                    • brookst3d

                      That’s a good one. Which suggests Amazon as well; similar revolution in distribution and scale.

                      But there were companies shipping DVDs before Netflix, and also companies streaming movies online before them. So really a marketing / operations / distribution revolution.

            • zombiwoof4d

              Slack was Skype

              • kulahan2d

                Absolutely not.

                Skype figured out how to build an integrated login experience. Every morning using Slack I keep having to remind myself it's not early FOSS!

        • quantified4d

          Tiktok was released in its current form in 2017, I believe. Is that before the period you're asking about?

        • KoolKat234d

          Sad to say: Tiktok, Only Fans, Discord

          • morkalork4d

            According to Wikipedia, Discord is 2015, Only Fans and TikTok are 2016.

            I'm as surprised as you are, I'd like to be proven wrong here!

          • forgetfreeman4d

            But these are all marketing wins. The underlying technology (let people post shit and talk to one another) had been stable for like a decade before they came along.

            • daedrdev4d

              No, they are product development wins. Discord is far more popular than alternatives because they provide a far superior product that also has many normal features that their competition somehow fails on, like voice chat, video chat, streaming, images and videos in chat that don't disappear, spam protection and good moderator tools, etc.

              • kulahan2d

                It’s an incredibly buggy experience with a user interface that fails many basic design cues (like not hiding things until you mouse over, not making the friends list icon inexplicably identical to a server icon and also the add server icon, etc.)

                It's popular because they figured out an unbelievably low-friction method to get users into the ecosystem. Being able to do everything online without installing anything was huge at the time. You had an account in like 2 clicks. THAT was a thing of beauty.

            • prisenco4d

              | The underlying technology ... had been stable

              That's my point. There are still creative ways we can use or improve stable technology to build new consumer applications.

              I'm not seeing much creativity lately.

              • forgetfreeman4d

                No argument there. Compared to Xerox-Parc's heyday the Bay Area has felt like a case study in cargo cults since before the dotcom implosion.

    • Spooky234d

      It’s not unique.

      We’re coming out of a long growth period fueled by two decades of war and the inflation that came with it, first in asset values from ZIRP and then from the COVID capital infusions.

      Look at where we sit economically, at least in the US. Real estate, a core economic and political engine is a bomb waiting to go off. Commercial real estate is totally underwater. Residential real estate is in another bubble. I won’t go into the madman in DC that’s gonna light the fuse.

      Everyone knows it at some level, so projects are getting cancelled. Changes add value but create problems. Take away change and you implode demand for labor. Avoiding the need for marginal labor lowers the marginal cost, so you start purging expensive people.

    • noneeeed4d

      I feel like a portion of it is the way that many/most companies have been captured by their finance departments. Everything is accounting. The CFO in many organisations has become the most powerful exec, and many CEOs seem to come up through the CFO role.

      Outside of places like Meta, who are printing money at a ridiculous rate, finance acts as a brake on any long-term or big bets. There can be no risk taking.

      I feel like this is one of Google's problems now. Once upon a time they were willing to take big swings with their piles of cash; now it's all about revenue maximisation at the low level. I forget which change it was, but they started charging for something, or limiting quotas on something, and the email contained the phrase "in line with industry norms", and I just thought that was very telling. Back in the early 2000s Google was constantly defying and upturning "industry norms"; now they are just like everyone else, squeezing every last drop from the smallest stones. Getting rid of the previously grandfathered-in free Google Workspaces was a good example. I find it hard to imagine that the cost of those even registers in their accounts compared with everything else.

      • bell-cot3d

        Sounds all too similar to General Electric, pre-Jack Welch... vs. General Electric, post-Jack Welch.

        The first prioritized engineering 'most everything that a modern nation might need. The latter prioritized only engineering its own financial statements.

        Both did very well...at least at their top priorities.

        • BeFlatXIII3d

          As they said in The Wire, “We used to make things in this town. Now, everyone just sticks their hand in the next guy's pocket.”

          • noneeeed3d

            a) That is a fantastic way to put it. It does feel like the whole world is focused on just moving money around

            b) I really must make the time to watch The Wire.

            • BeFlatXIII2d

              The Wire is absolutely worth the ~61 hours it'll take to watch all 5 seasons.

    • mrweasel4d

      Isn't there also the possibility that the US job market simply isn't doing too great at the moment and businesses are adjusting, or preparing for a downturn? In that case layoffs would be across the board, not just entry level or in the service industry. Perhaps the service industry and blue collar jobs have already been trimmed to the limit and knowledge work has simply been lagging behind. As pointed out, it's easy enough, at least for a limited time, to simply not do stuff and just attempt to coast for a few quarters/years.

      If it was truly a decline in knowledge work in general, it should be visible in other economies as well, but I don't think that's the case, at least not here in Denmark. Arguably we typically trend a bit behind the US, so it could be looming on the horizon.

    • giantg24d

      Yep, my company used to be known for service quality. We've seen an increase in negative feedback. The response from management is that it's only a small percentage, so it's nothing to worry about and that we'll focus on AI instead.

      • nradov4d

        It depends on the industry, but a lot of the concern over service quality was always more "cargo cult" management than anything really rational. Google (Alphabet) has been enormously profitable for decades despite the fact that their customer service is famously terrible or nonexistent (unless you're a whale of an advertising customer). Most of us don't make purchasing decisions based on service quality.

    • carlmr4d

      I think we'll have to start asking "AI, and...?", i.e. the question "what's your idea, besides AI". Everybody seems to talk about AI at this point as if it magically solves all issues we have. Even if it does, which I currently still doubt, it won't be a differentiator for your company. We have to ask, beyond AI, what's your idea?

      If your idea is only "use AI for X" your company doesn't have anything really. At least if AI isn't its core business like OpenAI.

    • wvh3d

      I've always admired people that get out before the shit gets anywhere near the fan instead of staying glued to their seat in quiet despair and apathy, grasping at something that feels like security and stability. If you're really honest, you can feel energy, enthusiasm, interest, ideas floating around, even hope and life force in some projects and companies. All too often that dies at some point, leaving only zombies behind (most of us).

      There is of course the trend of influencers, quiet quitting, gig economy and people going to work for themselves. That assumes that there is enough capital floating around for everybody to survive on their own... If you can't land a decent job in an established company, maybe you should think twice about being able to float above the waterline when you go at it by yourself.

    • softwaredoug4d

      Well, in a ZIRP environment every department head imagined the thing they were building was a massive growth opportunity. Even if they were basically refactoring a part of the app, the instinct was to build an empire. Now it's do the minimum at the lowest cost.

    • oxfordmale4d

      It is part of the regular economic cycle. Most companies try to do more with less, like in any downturn. This cycle is different, as senior management now has AI to justify hiring fewer knowledge workers. Things will turn around eventually when the AI bubble bursts. Companies will then scramble for graduates who have likely moved to work in different industries and complain about a talent shortage.

      • YetAnotherNick4d

        > senior management now has AI to justify hiring fewer knowledge workers

        Justify to whom? Shareholders just care about metrics like earnings and revenue. If they don't need workers to optimize these metrics, they have the right to hire fewer workers.

        • oxfordmale1d

          The number of employees you hire is seen as a proxy for growth and future earnings and revenue. With AI the argument is that you can grow with less staff.

    • Cthulhu_4d

      I suspect there will be a whiplash effect in a few years when companies realise they neglected their usual operations / innovation / maintenance work and they need to rush to catch up again.

    • andyst4d

      I somewhat resonate with this. I work in mature industries where competition has condensed down to larger players with enormous scale and embedded product distribution; they don't need to compete _much_ on innovation or product, just on price.

      I don't think AI specifically has done this; it's more the broader, constant stream of digitalisation of every department's function.

      Orgs don't need grads to learn the ropes from the bottom and make their way up the career ladder, when the ladders might only be 3-4 rungs high now.

    • seydor4d

      That's normal. You initially need a lot of workers to find the vein of gold, but after you find it you just milk it endlessly

    • KurSix4d

      It's like the focus has narrowed to chasing the next big disruption instead of steadily improving what already works

    • geye12343d

      Two words: Jack Welch.

      Every conversation of this sort should involve his name being shouted from the rooftops.

    • didibus4d

      Not just less in that sense, but also, they are no longer focused on growth, so the question is: how do we maintain the current offering while lowering the cost of operating it? You need fewer people to maintain it, and by losing people you also reduce the cost.

    • timewizard4d

      > is just that they’re doing less.

      The loaded term for that is simply "cutting corners."

    • james-arg3d

      Very much this.

    • squigglydonut4d

      they're dying

  • fullshark5d

    Bachelor Degrees need a complete rethink; they began as a modified finishing school for rich capital owners, who needed to make their children of the proper class before they could take over their businesses.

    It then became a vocational degree for the working class, despite being completely detached from useful skills for a wide swathe of degrees. The only value is that you could talk the talk and become a member of the professional managerial class if you impressed the right hiring committee/individual.

    In spite of this, we decided the working class should take out crippling loans to pay for this degree, and be in debt for the rest of their working life.

    It's not sustainable, and just forgiving the debt will only make it all more expensive and less aligned with the actual results we desire (useful workers).

    • jltsiren4d

      The liberal arts model was intended for the elites, and the idea of education as means of producing useful workers is straight from a totalitarian planned economy. But there is also the Humboldtian model of higher education, which focuses on educating informed citizens who are free to make their own choices. Learning vocational skills is easy enough if you have good education, and it's also necessary to be able to do that outside school, as careers rarely last a lifetime.

      American higher education is expensive, because you chose to defund public universities. And because you have an unhealthy obsession with rankings and top universities. Those are the things you need to change more than education itself.

      • chermi4d

        Another reason they're expensive is that the government basically guarantees it will pay whatever tuition the universities set, so they raise prices. There are lots of other factors, including administrative bloat, and maybe partially less funding... But they're still funded in large part by the government via loans. I don't know the right answer, but I know it's not simply more government funding.

        • downrightmike4d

          They also don't pay taxes because of a Supreme Court decision in favor of MIT for being a "Charity" and reducing tuition for some. Then every uni piled on and decided to use a lot of that money on sports.

          • beacon2943d

            Historically, sports were self-paying and a massive benefit. I don't know what the score has been for the last 2 decades.

            • downrightmike3d

              When everything else is starving, it doesn't make sense to keep sports constantly upgrading

      • blueflow3d

        > and the idea of education as means of producing useful workers is straight from a totalitarian planned economy

        I disagree. I get food from the supermarket, my roof is built by someone else, unlimited water from the faucet. This stuff isn't coming for free, it is other people's work. Of course I want to learn something useful to contribute back. But western societies don't seem to have a "We need X people with Y skillset" institution. If someone came to me like, "we need a welder to produce $needed_thing", then I would have put my skill points into that.

        • johnnyanmac3d

          So what is more efficient for this: 4 years of going into debt for a well-rounded education with a concentration covering maybe 1.5 years of welding (in theory), or 2-3 years as an apprentice learning your specific trade and focusing on your one task (not on how many credits are needed to graduate)?

          We had this structure with apprenticeships. Companies were the ones to say "we need x people with Y mindset". And they can pay to foster those people and mindsets.

          But they abandoned that because they didn't want to foot the bill for their own workforce. They instead put up with mediocre welders they kinda sorta train for 6 months, and maybe the good ones stay. Great model for society.

          • xnx1d

            > they abandoned that because they didn't want to foot the bill for their own workforce.

            Exactly. Privatize profits. Socialize costs.

      • jselysianeagle4d

        > American higher education is expensive, because you chose to defund public universities. And because you have an unhealthy obsession with rankings and top universities. Those are the things you need to change more than education itself.

        There's this and also the massive budgets for college sports and fancy student housing that make it worse.

        • nradov4d

          The massive sports funding is mostly only a thing for D1 schools and it doesn't take much funding away from academics. The sports mostly pay for themselves through ticket sales and media licensing, with the big chunks of revenue coming from men's football and basketball. Sports also drive a lot of alumni donations.

      • KurSix4d

        Defunding public institutions while glorifying prestige has created a system where access is limited and value is warped

      • gruez4d

        >the idea of education as means of producing useful workers is straight from a totalitarian planned economy

        ???

        • delusional4d

          It's a worldview that necessitates only teaching what leads to a job. It is anti-beauty and anti-human.

          We are not machines of production. We are human beings; we deserve to learn stuff that doesn't help us produce.

          • RevEng4d

            Why not both?

            If I want to learn how to do something, I want to be educated in that. There should be education available for that, but for knowledge workers, there largely isn't; instead, university has been hijacked to fill that role. In the trades there are schools dedicated to learning these topics and they work well at helping people learn to work effectively.

            Universities should be a place of higher learning and research - those are really important too. The struggle in universities today is that they are expected to be both and those things aren't really compatible. That's why you get research professors begrudgingly teaching undergraduate courses.

            We need an alternative to universities for learning the trades that knowledge workers tend to do.

            • jltsiren4d

              Some countries have alternatives. For example, Finland has institutes that call themselves polytechnics, universities of applied sciences, or something similar. ~60% of higher education takes place in them, with the remaining 40% in research universities.

              However, when similar fields of study are available in both types of institutes, employers almost universally prefer graduates from research universities.

              • nradov4d

                We have the same thing in the USA. Some schools like Cal Poly or Texas A&M have an explicit focus on learning by doing rather than knowledge for its own sake.

            • johnnyanmac3d

              I don't know, why not both? Companies killed apprenticeships and even training programs, so they took away your choice unless you're going blue collar. Even with blue collar, unions more or less took a lot of the potential spots for new blood, so it becomes a schmooze game for something you want to take your time and effort to learn.

              So yea, surprise that once again it comes down to companies that don't want to pay for the talent they yearn for.

          • woah3d

            Anyone who wants to can learn as much about the humanities as they want to these days without a college degree, with audiobooks, discussion forums, etc. I myself have done this.

            Charging the government $50k a year to force hungover 18 year olds to plagiarize essays at 7:45am before class to get a required credit benefits no one.

          • newAccount20254d

            Beautifully put. It feels like a view we sorely need as our corporate overlords become AI enabled.

          • naasking3d

            > It's a worldview that necessitates only teaching what leads to a job. It is anti beauty and anti human.

            The question is always who is footing the bill while you find yourself? Education that leads to productive vocations can arguably have some public funding because that gets paid back in taxes, etc. If you're paying for expanding your own horizons through education, then sure, have at it.

          • Supermancho4d

            > the idea of education as means of producing useful workers is straight from

            ...every social structure throughout time. An educated populace is more likely to be successful in achieving goals than re-learning from nothing.

            > It's a worldview that necessitates only teaching what leads to a job.

            Since education is generally funded, it makes sense that subjects are self-reinforcing. "Worldview" is, contextually, an anthropomorphism of a social process. The cause, not the effect, is people doing what people prefer to do. There's nothing outside of society pushing. People still make art, to their parents' chagrin.

            The situation of forced subjects, is not rigid. There is education for other subjects; almost everything or anything that meets some arbitrary qualifiers. Crystals, dog breeding, board game design? Humans tend to share knowledge (even when it's wrong). There's no shortage of learning in subjects that aren't primary subjects in some accredited college.

          • pembrook4d

            You can certainly go overboard on the “teaching only practical skills” thing, but id argue the pendulum has swung way too far in the other direction.

            The point of teaching useful skills is so people can be helpful to other people. While coded as right wing this is fundamentally pro-social. Finding a useful and needed place for yourself in society tends to result in more stable families, communities and mental health.

            Spending decades indulging your own ego tends to do the opposite.

            • AshleyGrant3d

              But this presupposes that the only "useful skills" are those that are economically useful. There are plenty of artistic skills that produce products that are not generally of high economic value but are helpful to society by making life nicer to live.

              There are statues that I go by every day that generate zero revenue for the city, but they make getting around town just a little bit nicer. Nobody is coming to this city to see these statues, so they don't drive tourist revenue. They're just there and nice to look at.

              The artists made some money, but there's no return on investment in straight dollars for the city that commissioned these works. And yet it is valuable to society.

            • Avicebron4d

              > "teaching useful skills is so people can be helpful to other people. While coded as right wing"

              I'm apparently living deep underneath bedrock and/or have spent too much time worrying about [EDITED] "weird math"

              When and where/how did the "helpful to other people == coded as right wing" happen? And does that mean people are finding fault in the first half because of the second?

              • pembrook4d

                The current zeitgeist according to chatgpt in the west:

                Right-wing coded: Practical education, Hard skills/trades, having children, personal responsibility, entrepreneurship, exurban community orientation, traditional morality/religion, decentralization (less elite control), bad aesthetic taste, low emotional intelligence, etc.

                Left-wing coded: Aspirational education, Universities, staying single into your 30s, personal irresponsibility (victimhood), artistic ego pursuit, urban orientation, performative morality/religion (climate doomerism, veganism, social justice-ism), centralization (more elite control), good aesthetic taste, high emotional intelligence, etc

                • skyyler2d

                  Your comment according to ChatGPT "in the west":

                  Prompt: { Someone posted this:

                  The current zeitgeist according to chatgpt in the west: Right-wing coded: Practical education, Hard skills/trades, having children, personal responsibility, entrepreneurship, exurban community orientation, traditional morality/religion, decentralization (less elite control), bad aesthetic taste, low emotional intelligence, etc.

                  Left-wing coded: Aspirational education, Universities, staying single into your 30s, personal irresponsibility (victimhood), artistic ego pursuit, urban orientation, performative morality/religion (climate doomerism, veganism, social justice-ism), centralization (more elite control), good aesthetic taste, high emotional intelligence, etc

                  Does this sound like something you'd say, chatgpt? }

                  ChatGPT: { No, that post is not an accurate reflection of how I analyze political or cultural dynamics. It presents a highly polarized and caricatured view of different ideologies, using broad generalizations and subjective judgments rather than a nuanced or objective perspective.

                  If I were to discuss cultural or political trends, I would aim for a more balanced approach, recognizing the complexities and overlaps between different worldviews rather than reducing them to a list of traits with implicit value judgments. Real social and political identities are more fluid and context-dependent than what that post suggests. }

                  Response: { Can you expand on how it's a polarized and caricatured view? }

                  ChatGPT: { 1. Using Broad Generalizations It treats political ideologies as monolithic and assigns rigid, opposing characteristics to "right-wing" and "left-wing" identities. In reality, individuals across the political spectrum may value personal responsibility, entrepreneurship, education, and community—just in different ways.

                  2. Assigning Subjective Value Judgments The post implies that one side has positive traits (e.g., "practical," "hard-working," "responsible") while the other is marked by negative traits (e.g., "egotistical," "performative," "irresponsible"). This makes it less of an analysis and more of a partisan framing.

                  3. Over-Simplifying Political and Cultural Attitudes Not everyone fits neatly into the categories presented. There are libertarians, centrists, populists, progressives, social conservatives, classical liberals, and many more who defy these stereotypes.

                  4. Framing Complex Issues as Binary Opposites Instead of acknowledging nuance, the post forces everything into a binary left vs. right framework, when in reality, people’s values often overlap.

                  Conclusion The post is more of a stereotyped narrative than an accurate representation of political and cultural trends. It distorts reality by exaggerating differences, dismissing complexity, and reinforcing partisan divisions rather than fostering real understanding. }

                  Really fascinating stuff if you ask me.

                  • pembrook2d

                    I prompted GPT for stereotypes of the current left/right culture wars, and simply edited down the LLM weasel language and wordy equivocation on each bullet point in the novel it wrote.

                    Fascinating indeed.

                    • skyyler1d

                      It seems your editorializing had a bigger impact than you thought!

            • hackable_sand3d

              What pendulum???

              • immibis2d

                The political pendulum, maybe? Good times make weak people, weak people make bad times (you are here), etc?

      • nukem2224d

        > useful workers is straight from a totalitarian planned economy

        I'd still strongly prefer a totalitarian, planned economy. Hell it sounds like a straight-up pollyannish fantasy at this point. Watching people expect the chaos of building our country around the right-to-be-a-dick-to-others to result in a functional country makes me want to burn everything down.

        But making your own labor force pay for their education? Straight suicidal.

      • rendang3d

        Subsidizing a service with public funds does not make it less expensive, it only changes who pays the expense.

        • wormlord3d

          What is economy of scale? What is induced demand? What is return on investment? What is increased total innovation?

    • KPGv24d

      > despite being completely detached from useful skills for a wide swathe of degrees

      It's a nice suggestion, but it's one that isn't supported by the evidence. Even controlling for other factors, a college degree makes more productive workers. And given that it's controlling for other factors, "selection bias" becomes a hard argument to make. STEMbros get real arrogant about their degrees (I have one; I've seen it first hand), but like it or not the person with an English degree still learned a lot of useful skills.

      Going to uni to major in a specific career is how you get screwed when available careers change.

      • JamesBarney4d

        > Even controlling for other factors, a college degree makes more productive workers

        I'd like to see this study. Most of the data I've seen that is pro-college still has massive confounds.

        Two twins graduate high school. One gets a crappy copywriting job, and spends her free time reading books on how to write better, and specifically how to do copywriting. The other gets an English degree. I'm not nearly as confident as you are the one with the English degree is going to be a better copywriter.

        I don't disagree you can learn skills, but cognitive science literature solidly shows far transfer is not a thing, and when it is it's incredibly inefficient. i.e. Reading the great works of Russian literature might make you a better copywriter but at a vastly slower pace than writing copy, or reading a book on copywriting.

        • Clubber4d

          I think an undervalued aspect of college over self-learning for most is that college requires you to learn a broader array of things. If I had been allowed to just pick the classes I wanted to take for four years, they would have all been computer-related classes. I would never have taken Chemistry, Physics, Drama, Psychology, History, International Relations, or anything that makes me a more educated and well-rounded thinker.

          • JamesBarney4d

            We look at that positively because it's a sunk cost, but if someone were to put the circumstances in front of us, would we still be happy about it?

            If your job told you that you had to pay 10k to study French poetry for 6 months without pay, would you be happy for the opportunity to become a more well-rounded thinker?

            • Clubber4d

              >If your job told you that you had to pay 10k to study French poetry for 6 months without pay, would you be happy for the opportunity to become a more well-rounded thinker?

              That's unrealistic.

                1. That's just one subject.
                2. Semesters and/or quarters are not 6 months.
                3. Most colleges don't charge $10K for one subject for 6 months.
              
              Having said all that, yes there were some annoying class requirements that I didn't particularly care for, but I still learned something from it. If nothing else, it allowed me to have more conversations with a broader array of people. Make no mistake, being an educated conversationalist impresses people. Being able to intelligently converse with more people because of a broader knowledge base has significant social and economic value. Imagine if you met your perfect mate by being able to talk about French poetry. Imagine your perfect company has a hiring manager who also loves French poetry. We are ultimately social creatures.
              • cdnthrownawy394d

                What about lost opportunity cost though? Or the cost of failure?

                My college intake class had 60 enter in the first year, and I was paying $2000 out of pocket a semester. Only about a quarter got the paper in the end; the rest either withdrew or flunked out. And from what I understand, 75% starting but failing to achieve a degree isn't atypical for that course.

                What happens to those students? The ones that take on debt and then realize far too late that they can't handle the workload, and end up with nothing but student debt, years of lost income, and no degree to show for it?

                And even for those of us that made it through, I'm... still uncertain it was worth the price. Not just in dollars.

                • Clubber4d

                  >What about lost opportunity cost though? Or the cost of failure?

                  Well for a high school grad today, most likely the opportunity cost is minimum wage service sector jobs. Those "temporary" jobs are also really easy to get stuck in for life. That's not a good life.

                  >Only about a quarter got the paper in the end; the rest either withdrew or flunked out. And from what I understand, 75% starting but failing to achieve a degree isn't atypical for that course. What happens to those students? The ones that take on debt and then realize far too late that they can't handle the workload, and end up with nothing but student debt, years of lost income, and no degree to show for it?

                  They're fucked, but at least they know where they stand. They should know their chances with the standardized testing somewhat. You definitely shouldn't go to college if you don't have the aptitude or motivation, but you should go to some sort of trade school: electrical, carpentry, plumber, etc. The safest path to college is graduated study, community college for the AA, then full blown university for the last two. At least you get an AA degree, and if you can't handle community college, at least it isn't very expensive.

                  >And even for those of us that made it through, I'm... still uncertain it was worth the price. Not just in dollars.

                  The value proposition is pretty muddied with the negligent inflation of tuition over the past few decades, that is a certainty. Unless you are Ivy League, any college will do. I'm fortunate to live in a state where tuition for public college is on the lower end.

                  • lurk23d

                    > You definitely shouldn't go to college if you don't have the aptitude or motivation, but you should go to some sort of trade school: electrical, carpentry, plumber, etc.

                    Why are you assuming they would fare any better in trade school?

                    • Clubber3d

                      >Why are you assuming they would fare any better in trade school?

                      Maybe they do, maybe they don't, but if they find themselves in service work, they should do anything they can to not be in that position. Some people just don't have any other options due to aptitude or attitude, so that's where they stay. That's not a great place to be in the US for the limited existence we have on Earth.

                      Something I learned in high school sports, particularly wrestling, is you have to always work to improve your position. That lesson holds very true in life.

            • RealityVoid4d

              There was never anything I regretted learning.

              • Loughla3d

                It's an odd attitude to me that I see on this site, and is VERY prevalent among STEM majors, CS especially. The attitude seems to be that the only learning you should do relates to income and/or career advancement. Look at the story on here yesterday about monetizing, or at least publicizing your hobbies.

                I do not regret any of the classes I took. I learned something from all of them. And it's not putting a rosy shine on something I was forced to do. I genuinely appreciate being well-rounded and knowing many things. It made me a better person.

                If people just want to learn core skills, get a technical degree like a certificate or associates. That's why those things exist. You can do a 6 month certificate in coding at my local community college if your goal is just to go work and not learn things other than that.

            • KPGv23d

              > We look at that positively because it's a sunk cost,

              I don't care what you "look at it like." I care what the actual statistics demonstrate. Facts don't care about your feelings.

          • tayo424d

            15 years later do you think these classes matter?

            I'm not totally sure I could tell you most of what I even took, let alone what I learned in those classes.

            Maybe right after college there's a window of a couple years.

            • JamesBarney4d

              Many people state how much they appreciate their degree. But when it comes down to it, almost everyone who goes back for a second degree does so with the specific purpose of getting a credential for their career.

              No one really values the intangible personal development that comes with education. If they did I'd have 5 friends who went back for a second degree to enrich themselves instead of 0.

              • timdev24d

                I think there's diminishing returns. A broad, liberal arts, undergraduate education develops critical thinking and reading skills in a zero-to-one kind of way. Once you've attained those skills (whether through a college degree or some other way), further enrichment via self-study is much more easily doable.

                • Clubber3d

                  >I think there's diminishing returns.

                  This is definitely true. I think pre-graduate college is pretty eye opening, at least when I went. In most high schools, they just cover the top layer of knowledge; in college they go quite a bit deeper. "They never taught us that in high school," is a saying that applies.

                  • BeFlatXIII3d

                    > "They never taught us that in high school," is a saying that applies.

                    Equally likely, the kid just didn't pay attention.

            • KPGv23d

              > 15 years later do you think these classes matter?

              Yes. A broad, liberal arts education made me a better, more informed citizen. One might even say I know more than the current President (certainly his supporters) about tariffs, what affects the price of eggs, etc.

              • eastbound3d

                If you used all that knowledge to disdain others instead of learning from them, then you have learnt nothing.

                “Yes but I’m right and they’re wrong” — When you’re 49% of the population, it doesn’t matter.

        • Ancalagon3d

          I don’t necessarily disagree with you, but it’s kind of a poor example considering copywriting work seems to be having a severe identity crisis because of AI at the moment. At least the English major still has their education to fall back on, while the copywriter might have their entire skillset made obsolete with no other credentials to utilize.

      • HPsquared4d

        I don't think there is any doubt spending 4 years studying a subject will increase skills in some areas. The question is whether the benefits are worth the cost (and that question applies both to the individual student and society as a whole).

        Remember the cost of all this is absolutely massive. Mostly the 4 years of lost time.

        • 18766hahsbc7674d

          It doesn't have to be though. In Europe the vast majority of people attend public universities that don't require taking on crippling debt to end up with a degree.

          I left uni almost 20 years ago, but one year of my tuition was about 1000USD at the time, something I could easily afford with a part time job. I'm sure the cost is higher now, but I would have thought it is still orders of magnitude cheaper than in the UK or the US. Germany subsidizes university tuition fees for a huge percentage of students, and adds a monthly stipend for expenses and free public transport while enrolled in uni.

          Your point is valid, challenging the worth/cost of higher education. But I think it is the cost part that is broken in some parts of the world, not necessarily the worth part.

          • wat100004d

            The cost is there regardless. If the ROI is negative then it's making somebody worse off even if the state is paying for it. If the state pays, it's even worse, because people may take the option based on a positive ROI for themselves while it's overall negative. If the ROI is positive but people aren't taking advantage because of the cost to themselves, then you want to look at subsidies.

            • delusional4d

              Sometimes "ROI" isn't measurable. What's the "ROI" of a well-rounded populace that gets along and doesn't want to kill each other? What's the "ROI" of creating beautiful works of art, or producing great culture? What's the "ROI" of happiness?

              Sometimes the purpose of central government is to make a negative ROI into a positive one, because there are greater purposes than the return.

              • lurk23d

                > What's the "ROI" of a well-rounded populace that gets along and doesn't want to kill each other?

                Calculable.

                > What's the "ROI" of creating beautiful works of art, or producing great culture?

                The academy doesn’t do this anymore and hasn’t for more than half a century.

                > What's the "ROI" of happiness?

                Is there any evidence to suggest college graduates are happier that accounts for differences in childhood socioeconomic status, lifetime earning potential, and similar confounding factors?

                > Sometimes the purpose of central government is to make a negative ROI into a positive one, because there are greater purposes than the return.

                While I agree with you in principle, this rationale can be used as a cover for bad policy. You take a measure that is widely accepted as an indication of worth, and then insist that the system’s failure to produce that indication of worth may actually just indicate that it’s providing an intangible goodness that cannot be measured nor accounted for (beauty, happiness, etc.).

                • AshleyGrant3d

                  > Is there any evidence to suggest college graduates are happier that accounts for differences in childhood socioeconomic status, lifetime earning potential, and similar confounding factors?

                  I'm not going to read this entire study just to respond to a single comment, but according to ChatGPT, the answer to your question is yes.

                  https://pubs.aeaweb.org/doi/pdfplus/10.1257/jep.25.1.159

                  https://chatgpt.com/share/67e5793f-e284-800f-8bb4-ebd340e4c9...

                  "Yes, the evidence strongly suggests that schooling has nonpecuniary benefits like increased happiness, beyond what can be explained by income or family background. But the authors remain cautious and emphasize the need for further causal research."

              • wat100004d

                It may not be measurable but we still need to think about it and do our best to figure out what things cost more than the benefits they provide.

                • delusional4d

                  What if we don't agree on what the benefits are? Tying the "benefits" to money breaks your system before you even get it off the ground.

                  We already have a way to discuss that benefit, it's the political process. Doing politics IS figuring out the benefits.

                  • wat100003d

                    You answer your own question here.

                    You don’t have to tie the benefits to money, but you do need to understand that there is a cost, some of which is money, and the benefits should be worth it if you’re going to be doing it. Who pays the money is a secondary concern to that.

          • nradov4d

            Most European public universities have low tuition because taxpayers are subsidizing them. This can only work if the number of students is kept relatively low. You can't have both cheap tuition and widespread access to higher education. The math doesn't work.

            It will also be interesting to see what happens now that Europe has to get real about defense spending instead of freeloading on US security guarantees. Since defense spending is going up there's going to be a lot of political pressure to cut higher education.

            • lodovic4d

              What supports your assertion that low tuition fees limit student numbers in Europe? This would only apply to non-EU students. I could very well claim the opposite, that high tuition fees limit student numbers in the US and make higher education out of reach for most.

              • nradov3d

                The money has to come from somewhere. Tuition doesn't cover the cost of operating those European public universities. Most of the cost is borne by taxpayers. More students would thus mean higher taxes.

                To an extent this also applies in the USA. Public universities in most states have relatively low tuition (although still higher than their European equivalents) for in-state students. But they are unable to meet market demand, so many students take out loans to pay tuition at lower-tier private universities.

        • PlunderBunny4d

          A higher education/degree teaches you how to think, not what to think. I’ve never used the knowledge from my physics degrees directly, but they taught me problem solving skills that I have applied to my career.

          Here’s a pointed take on it from one of New Zealand’s successful businessmen: https://www.nzherald.co.nz/nz/education/the-art-of-business-...

        • bobthepanda4d

          I mean we can reform it without necessarily throwing out the baby with the bathwater. In Europe a bachelor's can often take 3 years, and maybe we can even shrink it down to 2 years; this is already kind of a thing in the US with some schools offering a 5 year bachelor+master's program.

          Realistically maybe we reform community college to be the required thing instead of traditional undergrad, since the cost and length is more comparable.

          • monknomo4d

            associate degrees exist

            • bobthepanda4d

              they exist, but employers, at least, do not seem to think they currently cover enough to accept them in lieu of a bachelor's.

              In fact I would say I don't know how much additional value an associate's holds in that context, and maybe we just merge the two concepts. I haven't ever really seen a posting that is okay with just an associate's.

      • Ekaros4d

        More productive or more extractive? The managerial class has the degrees and they pay themselves and each other higher wages. A lot of production is done by those under them... Are they actually more productive or just a lot more extractive?

    • whateveracct5d

      This is a bit black and white. Or maybe just cynical :)

      It's inarguable that bachelor's degrees had real vocational utility in the last few decades (despite the memes).

      A variety of degrees from a state school (with in-state tuition) would lead to good, white-collar employment along with modest loans (if any), so there was clear ROI.

    • KurSix4d

      The bachelor's degree has become this strange hybrid of outdated prestige signaling and a vague promise of employability, but without delivering on either in a consistent way.

      • jeremycarter4d

        When you put it like that it really is a bad product.

    • timewizard4d

      > we decided the working class should take out crippling loans to pay for this degree

      Historical point you've completely missed. You used to be able to declare bankruptcy and discharge this debt until very recently. This was not "decided"; this was a trap sprung on the working class at a time when blue collar labor was almost entirely offshored.

      > and just forgiving the debt only will make it all more expensive

      That's why you discharge the debt instead. This is an institutional failure. They knowingly pumped a market at taxpayer expense. The haircut is theirs to take.

    • account-53d

      I couldn't agree more with this. I'm a mature student who left school with nothing, joined and left the military with nothing, and managed to get a relatively decent paying public sector job. I decided to put myself through a bachelor's degree part time and self funded. I now have one, it's a nice piece of paper, but the skills I gained from it I likely could have got a lot cheaper on udemy. In fact udemy is very likely to be more up to date.

      I obviously think the computer science degree is full of useful skills unlike a lot of degrees, but it's nothing I couldn't have got elsewhere.

      With a kid in university I am torn because I know that having a degree actually means nothing in the job market, not like it used to.

    • jongjong4d

      Agreed. Higher education mostly became a pretext to justify why certain people were entitled to receive higher salaries in exchange for lower experience and lower quality output.

      It's likely also partly why the west moved away from contract-based knowledge work and towards full-time work... It decouples the product of one's labor from its costs (salary) and it makes managers' lives easier as it gives them much more flexibility and less pressure with regard to how they measure and rank employee productivity. If costs were proportional to output, managers would pay a lot more attention to efficiency as their paychecks would depend on that efficiency.

      Instead, managers' paychecks and career prospects are mostly based on political nonsense like how many people are working under them; literally creating an incentive for inefficiency.

    • borntoolate5d

      Maybe lower education should just have a different schedule with other activity years? I'm not particularly impressed with the average American's ability to be a positive element of society and despite all the problems, I think liberal arts students are probably better than the rest when considered over their lifetime. But why should each individual take loans to have the critical thinking to vote in the interest of larger institutions?

      • JamesBarney4d

        Does liberal arts teach critical thinking? Do students who study liberal arts vs a mathematics/engineering show greater improvements on critical thinking tests?

        I get the idea we want a more educated population that can better make decisions. But the biggest way the populace makes poor decisions is they are economically illiterate, and they don't really understand how the government works. We should probably spend more time teaching this in high school and a typical degree spends very little time teaching these subjects.

        • fzeroracer4d

          The populace making poor decisions isn't just economically illiterate, they are increasingly fundamentally illiterate. Teaching economic literacy in school won't matter one bit when an increasing number of people are graduating unable to read even at a basic level.

          But in order to solve this problem you would have to overhaul our educational system and right now we have a party invested in destroying it so that it produces voters more aligned with their groupthink.

          • JamesBarney4d

            If you're arguing we should improve schools to increase rates of numeracy and literacy, then I'm in 100% agreement. But I thought we were talking about liberal arts degrees.

          • nradov4d

            While there are a variety of different interest groups that have influenced current federal government education policy, most of them don't seem to be trying to destroy it just for the sake of destruction. Instead the policy seems to be based more on the perception that the education system has been co-opted and corrupted to indoctrinate students with values and political philosophies that conservative or right-wing politicians find abhorrent. They see this as so damaging to society that it would be better to have no educational system at all rather than what we have now. I don't agree with this perception or their actions, but there is at least a kernel of truth to their viewpoint. In a politically diverse nation, if we want to maintain broad-based support for higher education, then we need to find a way to keep universities at least somewhat politically neutral.

            • immibis2d

              Many groups that want to destroy things make up excuses for why it is good. Even Adolf Hitler said the Jews were attacking Germany and he was just acting in self-defense. So the mere existence of an excuse doesn't mean very much.

        • immibis3d

          What is your definition of economic literacy?

          • JamesBarney2d

            A basic understanding of supply/demand and how many well-intended policies can have terrible effects.

            • immibis2d

              I can agree with that. That's very basic and uncontroversial. I asked because terms like "economic literacy" often masquerade as "thinks about economics the way I want them to". For example, some people would say that you must believe things like "financial markets are good" or else you're economically illiterate.

              But supply and demand happen in every economy, even those without money and even post-scarcity sci-fi. And unintended consequences are just a thing everywhere.

              Unless "unintended consequences" was shorthand for "if you hurt stock market investors that's bad for the country", of course.

    • NoMoreNicksLeft4d

      >It then became a vocational degree for the working class,

      This may have been some policymakers' intent, but it never really became that. Universities and colleges resisted being seen as mere vocational schools, and refused to modify curricula for that purpose. You'd see all sorts of academic arguments about how teaching them to do a job would be wrong, that they should still be teaching them to "think" and have heavy course loads of liberal arts.

      So parents and guidance counselors may have thought of it as vocational school, while getting the high-class university experience (as best someone of middling academic achievement can manage that) along with the high-class university tuition bills. Well, the guidance counselors didn't get the bills. Nor, in many cases, the parents.

      >It's not sustainable, and just forgiving the debt only will make it

      It's ok. We don't have to sustain it. The demographic implosion is well under way. We are at the point where elementary schools are closing (and everyone's making up excuses for this, so we can pretend that it's for any other reason than a demographic implosion).

      >and less aligned with actual results we desire (useful workers).

      At any point in history, most of the output from workers went (indirectly) back to the workers. If 98% of the workforce did agricultural work, this was because 98% or nearly enough of 98% of food was needed to feed those workers. Same for any product, no matter how tangible or abstract. Sure, there are exceptions... if people worked in a diamond mine it wasn't so diamonds would go to diamond miners.

      If they don't need workers for the relatively small amount of food that is consumed by some small group, if they don't need workers for making garments (or building buildings, or any other thing) for the relatively small amount of those products consumed by some small group, then they don't much need useful workers at all. Most of humanity is obsolete, or at least would be considered so if you belonged to a small group like that.

    • elric4d

      > Bachelor Degrees need a complete rethink

      Maybe. But what I think would help more is a broader focus on lifelong learning. Both on the job on company [dt]ime, and outside of work.

      In many places you can take evening classes for languages, or hobby-classes like embroidery or goldsmithing. Even programming. But there are very few places where you can take maths classes after hours.

      There's also, imo, way too much focus on degrees and not enough focus on competences. Some time ago I tried to register for some postgrad level classes, without any intention of using them for a certificate or whatever, but I was rejected for not having the right degree.

    • dionidium3d

      In order to rethink Bachelor Degrees, one must first rethink high school. It is routine in the US to see schools where 1) <=5% of the student body is proficient in math; but also 2) the school has a 90% graduation rate.

      If that's high school, then it's useless, both as a signal and just because, you know, nobody is learning anything. You pretty much have to have some other place for smart people to demonstrate that they're smart.

      • SkyBelow3d

        We need to admit schools are babysitting kids and not teaching them anymore. A few kids can still learn, but so many others don't. Especially the ones who would learn in a better environment, but whose class is disrupted by 1 or 2 students preventing their education. Once a student gets behind a year, they aren't going to ever catch up if they are only passed on to the next year instead of being identified as someone who needs to repeat the year.

        Edit: I should have been clearer with "are increasingly babysitting" and not stated it as strongly, as if it were some universal truth. I hear horror stories from teachers about how much of their time is focused on classroom management, how little on actual education, and how much effort is put into processes so the grades stay up regardless.

        • elric3d

          I wonder if alternative forms of education (like Waldorf/Steiner) would make more sense to more kids. It's clear that the standard way of teaching doesn't resonate with many kids, and we would do well to investigate that.

      • elric3d

        A few essential skills that I feel schools ought to teach:

        1. Personal knowledge management

        2. Digital literacy

        3. Calling bullshit in the age of big data (this is an actual course, freely available on youtube)

    • ransom15384d

      Well, these days you will need a non-repetitive physical skill. ER doctor, electrician, comedy tour guide, painter, etc. If your job requires you to sit in front of a computer, your job will be the first eliminated. Humans are quite cheap for physical labor compared to a full humanoid drone. Software costs are about to go to zero. Bachelor's degrees would turn more into guilds.

      • tempodox3d

        > Software costs are about to go to zero.

        Not all software is like webshit boilerplate.

    • AtomBalm4d

      Now that we have AI tutors that cost <$1/Mtok, why not do on-demand standardized testing for academic credentials and eliminate compulsory (de jure or de facto) education beyond working age? Universities can focus on research, and normal people get off the credentialism treadmill or focus on as-needed training at minimal cost. Need the social environment? Libraries or community centers can lend vacant space for “college” classrooms.

      Everyone wins, right? You know, except university administrators and lenders.

      • poincaredisk4d

        Great, let's make the average person even dumber. Or rather: dumb again. Universal education is a big win of the modern times and you want to destroy that for on-demand job training? Mass producing replaceable workers should not be humanity's goal.

        • martindbp3d

          How would it make people dumber? Students provably forget 90%+ of what they learn, and education is massively inefficient today, which can also provably be improved by virtual tutors (see Bloom's two sigma problem). Your reaction seems knee-jerk instead of thinking through this from first principles. Something like this is likely to happen whether you like it or not.

      • martindbp3d

        Exactly. I think the people who oppose this do it for sentimental reasons. Either because they want the prestige of a place like Stanford on their resume, or they enjoyed the social milieu. A social network of similarly ambitious people is also arguably more important than the actual knowledge. The knowledge is already available for free online, and like Bryan Caplan points out: at many universities you can just go and sit in class, but nobody does it because what you actually want is the credentials, not the knowledge.

    • dkkergoog5d

      Education can't be standardized because business is not standardized. It's about relationships, and every shop does things their own way. Teachers should just go contract and provide their own courses. The degree really can't designate what the person knows.

      • bluGill4d

        There is a lot of value to standardizing things though. Most of what people need to learn is standard. 2+2, i before e... There is the more nebulous step of can you solve problems, but even that is taught at the lower levels in a standard way.

        Eventually you get to a level where the answer hasn't been taught much and so there is no standard, but that only happens late in your education.

        Of course education itself needs more research. "learning styles" was debunked last I heard, but is there some variation that says some people need different teaching? Is there a better way to teach things? However if you are not involved in this research you should be teaching in standard ways that have been proven to work.

        • a2tech4d

          I before e isn’t even standard.

          • bluGill3d

            English needs to be standardized / simplified /reformed. I don't know how to make that happen.

      • kelseyfrog4d

        That's right. Education should be universalized, not standardized.

        The commodification of education will eventually make it useless. Universal, bespoke education is the ideal, but it's usually prohibitively expensive.

        • bluGill4d

          I strongly disagree. Commodification of education is very important. I want everyone to have the commodity reading, writing, and arithmetic education. It might be faster to give the student one-on-one teaching, but everyone needs this and I want it to be such a commodity that we give it to even the poorest "third world" kids who have nothing.

          As you get beyond those basics you have to become an expert in something that there are few experts in. McDonald's success is mostly based on their ability to take you from zero to a productive crew member fast. It takes many years to make a brain surgeon, in large part because we haven't commodified it - if we needed more we should (but since the world doesn't really need many it isn't worth it)

          • kelseyfrog4d

            Commodification denies a universal pedagogy. We've known this since the 1970s.

            If you want a system that appeals to a plurality, and leaves the rest behind, commodification will deliver just that.

      • mejutoco3d

        On the other side, pallets and standard shipping containers exist.

    • MITSardine4d

      Or is it not a more recent trend that people should educate themselves, rather than an employer doing that? Internships have become very short and secondary, when they used to be the main vocational training. It's curious that this has remained only at the two ends of theoretical studies: artisanship and scientific research (PhDs). Otherwise, job training has become the employee's and, where university is public, society's burden. Maybe it's employers that need a complete rethink, not so much universities?

  • ferguess_k3d

    What I worry a lot more instead is how knowledge of manufacturing and engineering could be lost due to our greed.

    Typical scenario: Industry I is not doing fine in country C (i.e. the fund managers are not happy about lack of growth of the public companies in this sector) due to reasons R1, R2, ..., Rn. Then management decided to outsource and eventually dismantle the factories to "globalize" it. Knowledge retained by the older generation of engineers, technicians and workers was completely lost when they passed away.

    • al_borland3d

      I think about this often and it bothers me a lot.

      I think the same can happen for knowledge work. Country A ends up turning into a bunch of managers outsourcing to Country B, and then at some point Country B realizes they can manage themselves. Companies are quite literally training their future competition. Once it reaches a tipping point, I don’t think it’s easy to reverse.

      • cultofmetatron3d

        > Country A ends up turning into a bunch of managers outsourcing to Country B,

        Why the need to be hypothetical? This is more or less how it played out between China and America. America outsourced all manufacturing to China while doing all the R&D and "innovation." One look at companies like DJI, BYD and Bambulab and it's clear that China can innovate just fine on their own now. Their products are becoming objectively better than the US-designed ones.

        • losvedir3d

          And Japan before that. My mom told me that many years ago "Made in Japan" had the stigma that "Made in China" has (or used to have). But growing up with Nintendo and Sega and with Hondas and Toyotas being the best, I always thought of Japan as a high-tech manufacturing hub.

          • badc0ffee3d

            This was a joke in Back to the Future - the people in 1955 associated Japan with junk, and 1985 Marty says all the best stuff comes from there.

          • thwarted3d

            There was a reference to that in Back to the Future 3, the difference between the way Japan was viewed in the 50s compared to the 80s.

            Doc: No wonder it failed, "Made in Japan"

            Marty: What do you mean, doc? All the best stuff is made in Japan.

          • bayindirh3d

            The thing is, Americans brought the idea of constant, small improvements to Japan. Japanese took the idea of kaizen and ran amok with it. Turns out kaizen is very fitting to their craftsman mindset.

          • swsieber3d

            Reminds me of something I read in a sermon published oh (checks date) 24 years ago:

            "More than thirty years ago I stood beside the desk of Professor Richard Rosenbloom, who taught courses in manufacturing management. In those days they called the field 'production.' I was a research assistant, and Professor Rosenbloom had just stood up to welcome one of his students into his office. The student, of medium height, was dressed in a dark suit and tie. He stood before the desk, bowed deeply, and handed to his professor a beautifully wrapped gift. He had completed his studies and was returning to his home in Japan.

            "The professor murmured thanks and then, to demonstrate his appreciation, he unwrapped the gift. It was a black fountain pen with gold trim. He sat down at his desk, took out a piece of paper, filled the pen from an ink bottle, took the pen in his hand, and began to write. The student beamed. But then he looked stunned, as we all did—the pen was not writing. Professor Rosenbloom pressed harder. The student frowned more deeply. Nothing. Professor Rosenbloom tried swirling the pen. Still, no ink flowed. Finally, in exasperation, the student reached across the desk, grabbed the pen, shook it forcefully, and said with great feeling, 'Cheap Japanese pen!'"

        • raducu3d

          > China can innovate just fine on their own now.

          If China could turn democratic and open, the world would have such a bright future.

          Alas, I think we're headed into some dystopian future because China is not going that way, but indeed, they're very likely to surpass the USA in every other way.

      • pca0061323d

        I feel like top management and share holders don't really care about things that may happen ~30 years later. And even if they care, there are a lot more other issues that may ruin a company, e.g. the entire industry is replaced due to technological advancement.

        • al_borland3d

          It's not necessarily the fault of the company. The government created the environment to make it the only way to stay competitive for a lot of companies. Some bucked the trend, and I respect them for it. There is a push to bring things back now, but it's going to be ugly. It's much better to avoid the issue in the first place like most countries, but it's too late for that.

        • ferguess_k3d

          They definitely do not, and there is no safety valve for it.

          Can't blame them though, this is what Capitalism teaches all of us. I'm short sighted too, but the thing is I won't be able to go wherever I want when SHTF while fund managers can probably do so.

      • raducu3d

        > Country B realizes they can manage themselves.

        That's a thing we were taught in "Civic Culture" in grade 5 -- that slaves can become masters through hard work and making the master dependent on them.

        But now I see it more in a figurative way: it rarely happens with actual slaves, but in a metaphorical sense it certainly can happen.

        • sawaali3d

          It did happen in the Muslim world. There are "slave" dynasties in many parts of that world (Egypt, India) which were robust dynasties born out of freed slaves.

        • auggierose3d

          Denzel also delivers this lesson in Gladiator 2.

      • treis3d

        I don't think that's ever really happened. If you look at GDP per worker the closest anyone has gotten to the West is Japan & South Korea at roughly 3/4 the production of France.

        They can take over existing industries like autos because they do it for cheaper. They can compete in niches for innovation. But they are all capped by structural issues that will prevent them from matching the economic innovation of the west.

        • ferguess_k3d

          I think GDP is a biased indicator when states are competing with each other. Industrial output makes more sense IMHO.

          • treis3d

            Not really because actually making stuff isn't as valuable as engineering stuff (or other service-like activities).

    • kjkjadksj3d

      This is a huge deal in some industries even today. Such as film photography. No one makes 35mm cameras although it would be all too easy to rehash a good 90s point and shoot model and have it sell like hotcakes (see prices/hype on the Olympus mju-II). No one even makes or services lab scanners anymore; all those Noritsus and Frontiers for pro lab scans are approaching 30 years old and nothing is even close on the market or could even catch up to what went on in terms of R&D building these machines for a seriously profitable industry at the time.

      Film has been getting increasingly popular. Local film labs are busy and new ones are opening. It is the vinyl record of imaging. Yet despite this, Kodak and Fujifilm have responded as most businesses lacking foresight do: cutting film stocks, raising prices, constantly trying to squeeze more blood from the stone, which goes on to kill growth in the industry instead of supporting it and fostering actual growth and more profits more than one quarter out.

      • whizzter3d

        There are always retro enthusiasts of "obsolete" tech/methods: people knit despite there being cheap cotton clothes, people tinker with 50s cars despite everything.

        While there is tons of money to be made for niche enthusiasts, these niches aren't always large enough to properly re-industrialize (chemical regulations, expensive machinery, etc.).

        • kjkjadksj3d

          The retro enthusiast solution in this case is a 3D-printed can you attach to your lens, and it's on you to invert and grade the color in Lightroom. Better hope the camera is perfectly aligned to the negative, that the negative is perfectly flat, and that you have a lens that can focus a flat field free of distortions. It is a far cry from the engineering that went into these pro lab scanners. Just the color science alone people struggle to replicate, resolution and sharpness throughout the frame as well; the speed of handling in a lab setting is second to none. People still pay a premium for these scans on these particular machines.

          Kodachrome homebrew efforts don't get close either. The process is too bespoke and reliant on instruments that no longer exist. Even proper C-41 chemistry comes down to whether the full Kodak kit is in stock and not backordered by desperate developers: those Instagram brands' kits contain blix and are inferior as a result to a separate bleach and fix kit. The Fuji Press kit, the old alternative, is hard or impossible to find.

          And no one is making new cameras. Only fixing old ones, and only the popular models that can be fixed and are worth the time of creating a secondary market of parts for (so basically Leica, and pay for that, or be relatively SOL to varying degrees for most everything else).

          The biggest issue is that this stuff was created when film was a global industry. Tens of thousands of engineers were working on every step of the process for decades. That is gone now. All that old knowledge and learning is mostly gone because these companies kept poor records and destroyed old, obsolete notes and material. Not just how the camera works, but how to manufacture it and everything else at scale, or at least in a way that is actually profitable. And perhaps it could never become profitable without the efficiencies brought on by scale and massive investment in manufacturing. And it would actually be lost forever once all this equipment we still have falls apart.

      • dukeyukey3d

        > No one makes 35mm cameras although it would be all too easy to rehash a good 90s point and shoot model and have it sell like hotcakes

        Quite a few new 35mm cameras are being developed and sold: from the budget Long Weekend to the uber-pricey Leica M6 and the mid-range Pentax 17. Even some half-frames are coming out, including from Kodak.

    • mschuster913d

      > Knowledge retained by the older generation of engineers, technicians and workers was completely lost when they passed away.

      That has happened quite often in the past [1].

      [1] https://en.wikipedia.org/wiki/List_of_lost_inventions

      • aaronax3d

        Only three real examples on that page.

        Petrification: no real loss to society, but I will concede that this appears to be an example of actual loss of "progress". Very few people are saying "oh deity, if only I could preserve this animal with perfect color and texture (but discarding all other characteristics)."

        Greek fire: the history of when this knowledge was lost is unclear. Also it was intentionally kept secret. Certainly we have functionally similar tech now. I'll give 50% credit.

        Panjagan: We don't even know if this was a weapon or a technique? Everything about this is incredibly vague.

        In summary, this has happened approximately 1.5 times in the past. Not "quite often".

        • dghlsakjg3d

          That list doesn't seem exhaustive in the least. A modern example is Fogbank, a classified nuclear weapons material that we recently had to reverse engineer to successfully rebuild certain warheads.

          https://en.wikipedia.org/wiki/Fogbank

          Edit: I see someone beat me to the punch with Fogbank. For another example, look to the F1 rocket engine that took us to the moon. Despite having the actual engineering documents, we just don't have the manufacturing skill to rebuild one: https://apollo11space.com/why-cant-we-remake-the-rocketdyne-...

          • Aloha3d

            There is another one where they had to recall retirees to restart Stinger Missile production.

            The F1 is particularly interesting, because if we'd undergone a program in 1985, we probably could have restarted production, maybe even in 1995 - but every decade another n% of the engineering know-how sloughed off into the permanent dirt nap until there wasn't enough left.

            Another recent example was when Jay Leno had a new heat exchanger made for his Chrysler Turbine car, and they were able to call some retirees in to help make a new one.

            • mschuster913d

              And that's why the US is constantly at war in one theatre or the other. It's incredibly expensive to keep production lines going - but it's vital because if you don't, you can't scale production up at a moment's notice.

              Us Europeans learned that lesson the hard way three years ago and we haven't made much progress ever since the first Russian boot set foot on Ukrainian soil. 100k artillery shells don't sound like much of a thing... but apparently it is.

              • Aloha1d

                No, but it is why we keep making tanks, even though the odds of a land war needing large numbers of tanks are tiny.

        • __turbobrew__3d

          Fogbank was a famous case where we forgot how to make it: https://en.m.wikipedia.org/wiki/Fogbank

          Similarly expertise in plutonium manufacturing has been lost, and now that countries have run out they are re-learning how they did it 40 years ago.

        • mschuster913d

          There's also the still-open question of how the pyramids of Egypt were built, and AFAIK NASA lost the knowledge of how to build some stuff as well. And the recipe for Roman concrete was also long lost, with researchers being able to reverse-engineer what it likely was only a few years ago [1].

          [1] https://news.mit.edu/2023/roman-concrete-durability-lime-cas...

      • kevinsync3d

        I had never heard the story of Sloot Digital Coding System [0] linked from that Wikipedia entry. Truly not trying to be judgmental or negative but I was exhausted just reading the article, let alone the idea of chasing that particular albatross as my life's pursuit lol.

        Really fascinating bit of trivia though!

        [0] https://en.wikipedia.org/wiki/Sloot_Digital_Coding_System

    • ge963d

      Didn't that already happen with US to China from the 70s

      • dudefeliciano3d

        german automakers are experiencing this right now

        • aleph_minus_one3d

          The problem with the German automotive industry is, in my opinion, different: they developed cars which are not the kind of cars that many people in their market want.

      • tempodox3d

        Those who cannot learn from history are doomed to repeat it.

        • makeitdouble3d

          Shareholders in the 70s made a lot of money. They'll be OK to rinse and repeat that history.

          • tempodox3d

            True. And the people selling shovels for the “AI” gold rush are making boatloads of money now. But it's still up to us whether we buy their customers' products.

          • __MatrixMan__3d

            That'll only work if somebody is willing to do the job for a lower price. The US may be in a different position the next time around.

            • trashtester3d

              Somebody, or SOMETHING.

              There will not be much work that cannot be done by Figure, Optimus, Atlas, Claude, Grok or GPT by 2035.

              • ozfive3d

                We've heard this for many decades. I'm sure we will be hearing it for decades more...

                • gaze3d

                  It doesn't really matter if they can or can't do the work. If enough execs are convinced AI can replace the workers, they will replace the workers, and management will have moved onto other things before they suffer consequences.

                • __MatrixMan__3d

                  I think we'll reach "most work done by unattended robots" around 2070. But we'd be better off if we started working on post-labor-scarcity economics ASAP--might as well start learning to swim before the ship sinks. It might even be fun.

    • jayd163d

      What is the scenario exactly? The knowledge is proprietary and thrown away? And also the worker never wrote or used the innovation elsewhere despite it not living in a patent?

      • amiga-workbench3d

        There is such a thing as institutional knowledge: you build it up by retaining staff properly, training replacements, and handing this knowledge down; you need continuity. Another important part of this is doing a thing often. For example, if you halt manufacturing of a widget and then want to restart manufacturing it two decades later, you are going to run into every single snag and problem found during the first iteration of the process; you're starting from scratch and you might not even be able to actually replicate the original process properly.

        The effect even applies on a larger national scale, where if a country stops investing in infrastructure projects for a long period, they will find themselves incapable of executing these projects properly in the future.

    • DAGdug3d

      Sorry about being pedantic, but what’s the value of I, C, R1-Rn here? Seems like a distraction!

  • apercu4d

    I'm guessing that the people who most espouse the virtues of AI do not "test" the output much and just let LLMs pump out errors.

    I use LLMs daily, but as a tool to brainstorm, mostly, or to write small parts of scripts (e.g., shell, not TV shows). But everything has to be verified.

    Last weekend I was using ChatGPT Music Teacher (or, trying to anyway) to prep some voice leading practices for guitar. I spent almost a half hour trying to get that model, then the base ChatGPT model to give correct information about inversions and the notes in the chords. It was laughably wrong over and over again.

    It would misidentify chords, say that a chord had the base attributes of a triad (tonic, third, fifth) while giving me a chord shape that had the root twice, and a third, and calling that a second inversion. Or giving incorrect fret/note information.

    If I didn't know theory and how intervals work on a guitar I would have been pretty screwed.
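
    (To be concrete about what knowing theory buys you here: the inversion is determined purely by which chord tone is in the bass, so the answer can be checked mechanically. A minimal Python sketch, assuming a simple 12-pitch-class model that ignores enharmonic spelling; identify_triad is just a made-up helper, not anything ChatGPT produced:)

      # Hypothetical sketch: name a closed-voicing triad and its inversion.
      # Assumes 12 pitch classes and ignores enharmonic spelling (C# == Db).
      NOTES = ["C", "C#", "D", "D#", "E", "F", "F#", "G", "G#", "A", "A#", "B"]
      TRIADS = {(0, 4, 7): "major", (0, 3, 7): "minor"}

      def identify_triad(notes):
          """Bass note first, e.g. ["G", "C", "E"] -> 'C major, 2nd inversion'."""
          pcs = [NOTES.index(n) for n in notes]
          bass, classes = pcs[0], sorted(set(pcs))
          for root in classes:
              intervals = tuple(sorted((c - root) % 12 for c in classes))
              if intervals in TRIADS:
                  # The chord tone sitting in the bass decides the inversion.
                  inversion = {0: "root position", 3: "1st inversion",
                               4: "1st inversion", 7: "2nd inversion"}[(bass - root) % 12]
                  return f"{NOTES[root]} {TRIADS[intervals]}, {inversion}"
          return "not a simple triad"

      print(identify_triad(["G", "C", "E"]))  # C major, 2nd inversion
      print(identify_triad(["C", "C", "E"]))  # not a simple triad (doubled root, no fifth)

    Nothing fancy, but it flags exactly the mistake above: a doubled root plus a third is not a triad, let alone a second inversion.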

    As it was, I wasted a half hour and never got anything usable.

    I'm not saying that the technology isn't fairly amazing, but like, don't believe the hype.

    • LeftHandPath4d

      The most I've done with it so far has been asking what a nice database layout might look like for a given problem, copying the output to a markdown file, and then referencing it as I personally design the schema.

      LLMs are pretty cool, but they make all sorts of unpredictable errors, and - especially when you're using them for things you're unfamiliar with - introduce all sorts of hidden costs you won't catch until later.

    • unclad59684d

      I've wasted so much time trying to get LLMs to help me code. One issue I have is that I can never seem to get the AI to say the word no. No matter what I ask, it will say "Absolutely! You can solve [impossible problem] like so...". At this point I basically use them as documentation search engines. Searching for things like "does this library have a function to do thing?". Gemini and deepseek seem to be good enough at that.

      I've entirely given up on using LLMs for exploratory exercises.

      • toxik4d

        Or the old asking a question Q1, getting wrong answer A1, explaining why it’s wrong with Q2, getting answer A2 that hyper-focuses on Q2 and misses important parts of Q1, restating Q1 you again obtain A1, repeat ad nauseam.

    • kjkjadksj3d

      It's because ChatGPT does not have the right answer. It has a trove of old forum posts with those keywords strung together in them and guesses what word is liable to come next based on the dataset.

      You can see how this is more like listening to a crowded room and relaying random words than actually learning through understanding ideas and building off of them.

  • hgs33d

    The vast majority of jobs that sustain our standard of living are blue-collar: farmers who grow our food, textile workers who make our clothes, construction workers who build our homes, plumbers, electricians, waste disposal workers, etc. I'd say it's white-collar work that became overinflated this past century, largely as a reaction to the automation and outsourcing of many traditional blue-collar roles.

    Now, with white-collar jobs themselves increasingly at risk, it's unclear where people will turn. The economic pie continues to shrink, and I don't see that trend reversing.

    It appears to me that our socio-economic model simply doesn't scale with technology. We need to have a constructive conversation about how to adapt.

    • rthomas63d

      The way I see it you only have two real choices:

      1. Raise wages to match global increased productivity

      2. Democratize ownership

      That's it.

      • mentalgear3d

        Democratize ownership it is.

        Imagine an early human group of 40 people. If one person hoarded the food of 37 others and employed the remaining two just to guard it, it wasn't long before the ruse was up, there was a revolt, and the food was reclaimed.

        Now, only because of scale and abstraction, basically the same setup is possible (0.1% owns as much as 50% of the population).

        Our time is perverted ownership-wise, and it's time to go back to a truly cooperative society.

        Cooperation, not hoarding, was the foundation of the beginning of civilisation.

        ---

        My ideal future resembles Star Trek: a world where money is obsolete (at least on Earth), and people pursue exploration, science, and the arts purely out of passion and curiosity.

        A society driven by innovation, not profit.

        • lumenwrites3d

          I don't think the world where a mob of people can gang up on a person and take their stuff is as idyllic as you think it is. If the person who has figured out how to earn a lot of food doesn't get to "hoard" it, it'll just get hoarded by a person with the biggest stick.

          What's worse (for the society), is that in this world nobody has an incentive to create wealth, because they know it'll just be taken away. When rich people aren't in power, people with political capital and big guns are. I don't think that's better.

          If AGI takes over, that changes things, somewhat. If it creates unlimited abundance, then it shouldn't matter who has the most (if everyone has plenty). Yes, it would create power disparity, but the thing is, there'll always be SOMEBODY at the top of the social hierarchy, with most of the money and power - in the AGI scenario, that is someone who is in charge of AGI's actions.

          Either it's AGI itself (in which case all bets are off, since it's an alien god we cannot control), or the people who have developed AGI, or the politicians who have nationalized it.

          Personally, I'm uncomfortable with anyone having that much power, but if I had to pick the lesser evil - I'd prefer it to be a CEO of an AI company (who, at least, had the competence and skill to create it), instead of the AGI itself (who has no reason to care about us unless we solve alignment), or one of the political world leaders (all of whom seem actively insane and/or evil).

          • anigbrowl3d

            If the person who has figured out how to earn a lot of food doesn't get to "hoard" it, it'll just get hoarded by a person with the biggest stick.

            Where are you getting the 'earn' part from, and why isn't it in scare quotes like 'hoard'? It seems like you're just changing the parameters of the argument to support the conclusion you prefer.

          • sjducb3d

            Most hunter-gatherer societies have big differences in productivity between members. I remember reading one example where one man did all of the hunting for a tribe of about 40 people. He really enjoyed both the hunting and the status of being the best hunter. He shared the meat freely. No one was taking it away.

            • lumenwrites3d

              Nothing about the current system (capitalism) prevents people from sharing freely, that's just charity. I think it's wonderful and admirable when people do that, and I fully support that, as long as it's voluntary.

              I'd be happy to live in a version of society where there's enough abundance and good will that people just give to charity, and that is enough to support everyone, and nobody is being forced to do anything they don't want.

              I only dislike it when people advocate for involuntary redistribution of wealth, because it has a lot of negative side effects people aren't thinking through. Also, because I think that it's evil and results in the sort of society and culture where it would be a nightmare to live in.

              • ido3d

                Isn't "involuntary redistribution of wealth" literally every country on earth though (aside from a few that have such a lack of rule of law that the state can't tax the population)? Do you consider the entire developed world (and most of the rest) a nightmare to live in?

                I live in Germany where we have taxes & don't consider it a nightmare.

                • lumenwrites3d

                  I think it's a gradient. When I think about the "nightmare to live in", I think of the Soviet Union or North Korea. Those are the places that went all-in on redistribution.

                  Most western countries mostly respect individual freedom and property, taxes being an exception to that, somewhat limited and controlled. I see that as a necessary evil - something we can't fully avoid (at least, I can't figure out how we'd do that), but should try to minimize, to avoid sliding down the spectrum towards more and more evil versions of that.

                  I think most western countries are nice to live in because they do a comparatively good job at respecting people's freedom, property, and the right to keep the stuff they earn.

                  Advocating for more redistribution is taking steps away from that, in a direction people don't realize they don't want to go.

                  • andrekandre21h

                      > Advocating for more redistribution is taking steps away from that, in the direction people don't realize they don't want to go in.
                    
                    With shared ownership (e.g., a cooperative business) there isn't forced redistribution in the first place; I think that's the point of the original poster?
                • dragonwriter3d

                  > Isn't "involuntary redistribution of wealth" literally every country on earth though (aside from a few that have such a lack of rule of law that the state can't tax the population)?

                  Places with governments that weak also tend to prominently feature involuntary redistribution of wealth. It tends to be more self-service at the hands of the end-recipients and without the kind of ethical theory behind it that is at least the notional framework for redistribution by functional governments, but it still very much occurs.

              • sjducb3d

                It’s not as involuntary as I made it sound. I think if he decided not to share the meat then he would have problems with the rest of the tribe.

                • ido3d

                  You also can only eat so much meat anyway, and it spoils at some point (especially in a society without electricity/refrigeration).

          • canadaduane3d

            > What's worse (for the society), is that in this world nobody has an incentive to create wealth, because they know it'll just be taken away.

            I know this has been discussed at length in many places, but I just want to point out that it isn't binary. There is a spectrum, from "sovereign ownership" (full protection, no taxes, no redistribution), which would entice the most people to create wealth (and even then, I doubt it would be 100% of the population), all the way to "mob rule", where a minimal number of people would be enticed to create wealth (and I don't think it would be 0%). People do things for multidimensional reasons.

            That said, our societies have tried many variations along the spectrum between these two extremes, and I think we have uncovered the importance of protecting wealth and the incentive to create it.

          • tucnak3d

            > in which case all bets are off, since it's an alien god we cannot control

            I hate to break it to you, considering how much effort you put into your comment, but this is already the case. Global economics is something we CANNOT control already, so the world you live in is ALREADY governed by an alien god. The self-described "optimists" here are naive at best, and delusional at worst.

          • neutronicus3d

            Abundance is going to be limited by raw materials.

            The less need there is for human labor, the less disincentive humans have for killing each other over raw materials.

        • ngneer3d

          There is no sequence of steps that takes us from where we are to the society depicted in Star Trek, or at least none has been outlined so far. If it were to happen, the world would need an abrupt phase change (e.g., First Contact). You may be tempted to call me a pessimist, but I am a realist. To convince a realist, one must show a sequence of steps.

          • arbot3603d

            Star Trek outlined a sequence of steps but you might not like them...

          • SketchySeaBeast3d

            I think you're dead on. Growing up I hoped for a Star Trek future. Now all I see is Alien without all the xenomorphs.

            • IanHalbwachs3d

              This comment may haunt me for the rest of my life.

              • SketchySeaBeast3d

                Well, if we're lucky we might see a xenomorph or two before we go, possibly even while screaming.

            • ido3d

              Or Blade Runner.

          • canadaduane3d

            > or at least none has been outlined so far

            I love this, thank you for specifying the condition under which you might be convinced otherwise. I respect your position more for this. (And unfortunately as a Star Trek fanatic myself, agree with it so far as well).

          • rthomas63d

            In the Culture books it was AI...

          • tiberious7263d

            Step 1) world war 3

            Step 2) some drunk invents FTL

            Step 3) the Vulcans show up

        • phkahler3d

          >> My ideal future resembles Star Trek: a world where money is obsolete (at least on Earth), and people pursue exploration, science, and the arts purely out of passion and curiosity.

          The problem with the Star Trek fantasy is that it's a lie even within the show. There are still people with obligations. There are still treaties and trade. There is still a hierarchy where some people have more important jobs and report to others. The only thing that seems different is that no money changes hands.

          If we actually automated everything so no one had to work, there would still be a gradual progression and a time when some people have to work while most don't. I'm not sure how you get through that phase, not to mention what it looks like on the other side.

          One thing I think should be considered is not so much how to pay everyone "enough" but how to reduce cost of living so more people can work less, or eventually not at all (somehow without owning everything and amassing control).

          • HPsquared3d

            We already live in a world where only a small number of people ACTUALLY need to work. Most of humanity's expenses are attributable to lifestyle creep. If we could be happy with a 16th century peasant standard of living delivered by a small number of people using modern tech, only a tiny percentage of the available manpower would be required. Lifestyle creep eats all gains in productivity. Work also expands to fill the available time.

        • bumby3d

          I like that ideal, but it only comes about in abundance. Unfortunately, I think humans are programmed to rarely feel a sense of abundance because we innately desire social status. Social status is a relative metric, meaning it only exists in relation to others. This, combined with a greed impulse, renders a constant need for more. In other words, the human state often runs counter to a sense of abundance, and this seems incompatible with that ideal. I think that’s why capitalism, warts and all, has been an engine for progress.

          • JanisErdmanis3d

            The core issue seems to be that social status is tightly coupled with currency, and it can be optimized in destructive ways. Perhaps we need a currency with multiple colours that could be earned in different ways and could only be spent within its own sphere of influence, as long as the colours could be enforced to not be interchangeable. That way one could have higher social status in one sphere of influence and lower in another.

            • bumby3d

              William Storr has written about this and gives three different styles of social “currency”: dominance, virtue, and success.

              Capitalism favors success and dominance over virtue, but some social subsets (eg clergy) get status through virtue.

              • JanisErdmanis3d

                I may have a look at William Storr later on. The separation of social currency does, however, seem a rather primitive one.

                • bumby3d

                  Primitive in what sense? Lacking nuance, or not reflective of modern society? I do think that the success category is easily mapped to money, which may be why it’s a primary heuristic for measuring success in a modern western society.

                  • JanisErdmanis3d

                    I agree that money, or the ability to attain it, is what modern society considers success. However, the categories seem to be a reflection of the universal nature of money, which can buy any influence/right.

                    If we had a hypothetical coloured currency where one kind could only be spent on buying property, other necessities, and some luxuries like travel, then as long as the currencies could be enforced to not be interchangeable, the "dominance" becomes relative with respect to the influence points one has collected.

        • K0balt3d

          Worse yet, we are moving away from money and towards power being the goal. Automation stands to fundamentally upend the whole cart, destroying capitalism as we understand it today.

          Money exists fundamentally to motivate people to perform some action. In the end, it comes down to paying labor. No one pays the earth, the forest, the sun.

          Automation stands to make human labor too expensive to compete, since our needs far exceed the energy it takes to operate and manufacture us.

          Without human labor, you don’t need money. You just need more anthropoid robots to make more anthropoids and to execute on whatever it is you want to do, own, or construct. You don’t buy a yacht, you build a one-yacht factory to make a yacht.

          Capital will become human-sparse, needing only resources and energy and will exist to serve a small group of beneficiaries, if anyone.

      • mhuffman3d

        >The way I see it you only have two real choices:

        Unfortunately, due to the way politics and money work in the US, we have zero real choices...

        >1. Raise wages to match global increased productivity

        This is blocked by our two-party political system. Openly by Republicans (at the moment) and practically by Democrats. Both of our political parties seem to serve the people that actually pay them (i.e., rich business-owner donors) rather than the people who vote for them. At least if it comes down to a conflict between the two. This is shown in sharp relief in the famous chart by Martin Gilens showing which laws get passed in the US based on interest group.

        >2. Democratize ownership

        This is blocked by actual owners. Money equals ownership in the US. So, perhaps we can use our dollars and spend a certain way, one might think. Just BlackRock, Vanguard, Fidelity, and State Street combined have more assets under management than the entire United States GDP, and they are just 4 companies. Good luck!

        >That's it.

        That is probably not it, but if it is we are fucked!

      • hn_throwaway_993d

        The general folly I see in these types of discussions is that people believe that we must fix these things because in the mid 20th century we reached a relatively great state with broad-based prosperity (at least in the West), and so obviously we'll fix our problems because otherwise we'd backslide to a worse state.

        Sorry to be a pessimist, but progress is not guaranteed. I see the mid twentieth century as largely an anomaly in human history. Going forward, I see wealth concentration continuing to accelerate, with a widening gulf between classes that control the means of production and everyone else, which due to technological advancement will make a lot of people's labor much less profitable. Basically, a reversion to a more feudal system, where there is essentially an aristocratic class that hoards and lives off its previous wealth, and pretty much everyone else living at a subsistence level. Think Ireland in the early 1800s (not necessarily that level of absolute deprivation, but same level of relative deprivation compared to the land owners).

        The reason I see this as the most probable outcome is that when I hear people talk about "raise wages to match global productivity" or "democratize ownership", I don't see any rationale as to how or why that would happen. Do you think the people in control will just give that away from the goodness of their hearts? There is literally no economic or social reason to expect this to happen. It's clear that democracies can be successfully manipulated into "blaming the boogeyman", so I don't think the democratic process will bring about these changes.

        Happy to hear a rational argument to the contrary, but my primary point is that I rarely hear any argument about how we get to there from here.

        • canadaduane3d

          I agree with your points over all, but lacking a complete "rational argument", I'd just like to outline a few ideas that I'm still working on, and while not a complete fulfillment of your desire for a map from here to there, might be a starting place for ideas. Like you, I see the seeds of a potentially dark future--but maybe it isn't our fate just yet.

          I'd start with changing what and how we measure. A move away from single-dimension variables like GDP and simplistic closed-form calculations like the Black-Scholes formula and all it led us to believe.

          If we agree simple-but-wrong metrics are bad, then we can (I believe) move towards simulations--not "my simulation" nor "your simulation", but ways to talk about beliefs and outcomes. I think the future will involve AI-assisted computable discussions, where multiple variables and the ability to dynamically incorporate or exclude assumptions from opposing perspectives will lead us to some shared agreement and mutually beneficial outcomes (while allowing for many areas where people will continue to disagree).

          I'd propose next that we continue to raise the prominence of evidence showing how cooperation is often better than competition. Nobel prize winner Elinor Ostrom spent her life identifying systems and methods of cooperation. She proposes, "We are neither trapped in inexorable tragedies [e.g. of the commons] nor free of moral responsibility."

          Robert Axelrod ran simulations on the iterated prisoner's dilemma and concluded, "forgiveness, cooperation, and reputation" are a stable strategy in most real-world conditions.
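
          A minimal sketch, in Python, of the kind of tournament Axelrod described; the payoff values and the two strategies below are illustrative assumptions, not his exact setup:

            def tit_for_tat(my_hist, their_hist):
                # Cooperate first, then copy the opponent's previous move.
                return their_hist[-1] if their_hist else "C"

            def always_defect(my_hist, their_hist):
                return "D"

            # Standard prisoner's dilemma payoffs: (my points, their points)
            PAYOFF = {("C", "C"): (3, 3), ("C", "D"): (0, 5),
                      ("D", "C"): (5, 0), ("D", "D"): (1, 1)}

            def play(a, b, rounds=200):
                ha, hb, score_a, score_b = [], [], 0, 0
                for _ in range(rounds):
                    ma, mb = a(ha, hb), b(hb, ha)
                    pa, pb = PAYOFF[(ma, mb)]
                    score_a, score_b = score_a + pa, score_b + pb
                    ha.append(ma)
                    hb.append(mb)
                return score_a, score_b

            # The cooperative-but-retaliatory strategy earns 600 vs 600 against itself,
            # while the pure defector only squeezes out a small one-round advantage.
            print(play(tit_for_tat, tit_for_tat))    # (600, 600)
            print(play(tit_for_tat, always_defect))  # (199, 204)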

          Strong ideologies that promote extreme individualism, marketed as scientifically sound, deserve great skepticism IMO, and should be treated with the same wariness as two missionaries knocking on your door.

        • gosub1003d

          If you accept the statements "poverty causes violence" and "wealthy people (ultimately) cause others to be poor" (which I admit is a tenuous claim, but I think many people would agree, at least in certain cases), then all "violent revolution/rebellion" is, is a redistribution of suffering, bringing it back to those who caused it.

          It's an incredibly murky train of logic though. Most wealthy people have done nothing directly to cause the poor to suffer. But if you examine it closely, you can find links from greed <---> suffering everywhere.

        • ozmodiar3d

          I agree with all of this, and frankly it has me terrified for the future of humanity. With enough AI and automation you don't even need other humans, just the resources to hoard more means of production. The only rational argument I have for how we get somewhere else is that eventually, once we start hitting the end game of resource accumulation, someone's going to start launching nukes and the destruction of technology will send us back to a time when other humans actually mattered. A bit optimistic, I know.

      • tmaly3d

        > Democratize ownership

        I think there is another word for this.

        Read To the Finland Station by Edmund Wilson.

        There are only a very few cases where democratized ownership worked, and it was under a benevolent dictator. After the dictator died, everything fell apart.

        • hgs33d

          The book you referenced appears to be about the rise of Marxism. I'm neither agreeing nor disagreeing with the GP when I say this, but I think it's worth pointing out that you can democratize ownership under capitalism, e.g. a worker cooperative [1] is a business owned by the workers.

          [1] https://en.wikipedia.org/wiki/Worker_cooperative

      • bumby3d

        I think you’re casting too narrow of a net here. Other options include an automation tax, UBI etc. (unless you consider those a subset of your items above)

        • conductr3d

          > unless you consider those a subset of your items above

          It is the same, just with extra steps which put politicians and such in control of it all. Which is why something like these really are the only viable tactics for the overarching strategy: politicians won't allow change that excludes them from power.

        • contingencies3d

          Automation tax sounds sketchy. How would that work? All computers have DRM and programming becomes illegal? Unregistered physical automation becomes banned? Man the toolmaker can no longer make tools? What use are a bunch of depersonalized hominids?

          • bumby3d

            It’s been talked about for years, but my lay understanding is that tax is levied on automation that displaces human production. This circumvents the issue where we have a system based on income tax, but a lower ratio of income to production.

            So if a factory used to employ 100 people, who were paid a salary, that salary was taxed generating income for societal benefit like roads and hospitals. But if automation comes in and produces the same with just 10 people, the money from income taxes for societal benefit is reduced by 90%. The net effect is that society may have less money for the collective benefit even as production gets more efficient. An automation tax would make up the difference.

            It’s not altogether different than the “mileage” tax for electric vehicles to displace the gasoline taxes that fund roads. It’s a different tax scheme because the fundamental premise has changed (road use is proportional to gasoline consumption/tax needs are proportional to human salary). Taxes are systems of convention so we don’t need to pretend they must adhere to some immutable physical law.

            To your question about how it could be implemented, I’m sure there’s lots of nuance. But to illustrate it off the top of my head, industries may have baseline rates of per-capita production and if they implement automation to exceed that substantially, that excess production would be taxed. So a craftsman woodworker who makes five items a week wouldn’t be affected, but a cabinet factory making 300 per capita items per week would.
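
            To make the arithmetic concrete, here is a rough sketch of how such a levy might be computed; the baseline, rate, and numbers are made up purely for illustration:

              # Hypothetical automation-tax calculation: levy only the per-capita
              # output that exceeds an assumed industry baseline. All numbers invented.
              BASELINE_UNITS_PER_WORKER = 10   # assumed industry norm, per week
              TAX_PER_EXCESS_UNIT = 2.0        # assumed levy, in dollars per unit

              def automation_tax(units_produced, employees):
                  per_capita = units_produced / employees
                  excess = max(0.0, per_capita - BASELINE_UNITS_PER_WORKER)
                  return excess * employees * TAX_PER_EXCESS_UNIT

              print(automation_tax(5, 1))      # craftsman: 0.0, under the baseline
              print(automation_tax(3000, 10))  # cabinet factory: (300 - 10) * 10 * 2 = 5800.0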

            • contingencies3d

              Thanks for clarifying. Concerns: (1) Looking backwards: "a factory used to". That's questionable, we should look forwards. (2) Proxying. "was taxed generating income for societal benefit". If the net outcome of a factory is to give people money, shouldn't we just skip the factory and give people money? Isn't that more efficient? (3) Suggestions around structure and limits. These will be abused. I'm a craftsman woodworker, but so are my family, we in aggregate produce more than the factory, but we're all under the threshold.

              I'm not sold. Actually, I think there's more to it. There's been an historical association between people's job and their sense of identity and self worth. The value of work is not just about earning money, or the portion of that money that is transferred to social uses through taxation or otherwise. If you just implement UBI there'll be bored directionless people, drug use, criminality and social problems. What we need is recognition that (a) we don't need the people anymore (b) that's OK (c) different people may freely choose between social engagement and hibernating in a room with VR or going full artist hermit mode or becoming a triple PhD or being a psychonaut or whatever.

              The oddball reality is we're sort of there already, it's just not evenly distributed. Ask an anthropologist or an economist or a technologist: what policies should governments put in place to support people and society in a transition to "nobody works or needs to work or engage with one another at all?" Turn them all in to GPS tracked phone zombies? Build sovereign wealth funds? Redefine collective identities in virtual spaces? Slowly introduce methods to reduce fertility? Reject technology and return to nature?

              It seems to me that our ape minds are not well suited to the new reality. Most will seemingly choose to live a reactionary life of experiential consumption in a bubble of consensual hallucination crafted by technology that is controlled by others... and soon, controlled by endless generative AI. As a species we are enslaving ourselves to the perpetual feed through laziness.

              In an era where similar popular experience increasingly lies at every corner of the globe: ask yourself - what kind of life do you want to lead? As technologists, we now arguably have greater ability to alter humanity's course than the politicians. We should think about how to use it.

              • bumby3d

                >"a factory used to". That's questionable

                1) The past gives context and so we shouldn't dismiss it. In this case, it provides insight into why the current tax structure is the way it is. If the underlying premise changes (i.e., the nature of work changes), we should look at also changing the tax system. To me, that is looking forward, but using the context of the past to inform our judgement. Imagine if the past tax was based only on agrarian income because that formed the basis of the economy when the country was founded; I don't think we'd want that same tax structure in a modern economy when the proportion of farmers is in single digits.

                2) I think this ignores the overall system. The economic system isn't one dimensional. Society benefits from the factory, but the factory also benefits from society. They don't work in isolation, so a single end goal ("just skip the factory and give people money") is an over simplification of the system and its goals.

                3) But you would all pay an income tax, so (in an ideal implementation) it balances out. No automation tax collection, but higher income tax collection. The inverse is true when it spills over the automation tax threshold: lower income tax on per capita production, but higher automation tax collection.

                >I'm not sold. Actually, I think there's more to it.

                Of course there is, and I've admitted there would be a lot of nuance. We should be careful that the nuance isn't gamed, but that doesn't mean the best alternative is the current system, or worse, an overly simple system like a flat tax that disregards nuance completely.

                >There's been an historical association between people's job and their sense of identity

                Yes. There's some benefit to this, but it can easily spill over into a toxic mindset. I don't want to live in a society where people's identity (and potentially moral self-worth) is at the whim of an employer. I think the job/identity coupling is a very Western mindset (probably rooted in a Puritan work ethic), but not a particularly healthy one. That's why we have issues with diseases/deaths of despair like you allude to. I think the better solution is to decouple people's self-worth from their work, rather than ensure people keep working.

                >The oddball reality is we're sort of there already

                I hear this a lot, but it reminds me of when I was in college (decades ago): I had a professor (who was nearing retirement at the time) who spoke about how, when he was in college, he had to write an essay about how people would manage their lives when they no longer had any work to do because of all the efficiency gains that were just on the horizon.

                >As a species we are enslaving ourselves to the perpetual feed through laziness.

                I think this is the Aldous Huxley Brave New World viewpoint (written in 1931), so I don't think it's anything new. FWIW I tend to agree.

          • gosub1003d

            It would boil down to a profitability tax. As an exaggeration: a company with 7 people on the payroll that brings in $10m a year has to pay it. But a company that employs 700 and earns the same does not.

            • contingencies3d

              I've lived in China and seen the results of "CCP says you have to employ X people". The result is a bunch of people sitting around doing nothing, earning low incomes, and feeling despondent: the so-called "iron rice bowl". The system will be gamed by the managers, the people will merely be pawns.

              • gosub1003d

                That's a valid concern, but not an inevitability. I think if they don't want to pay the tax, they could move their business to a tax loophole state like Wyoming. But they shouldn't be allowed to siphon money out of a city without paying into the city. If doing business in CA gives them access to 30+ million potential customers, they can give CA a cut.

              • maxglute3d

                The alternative is unemployed people sitting around, earning no income, feeling more despondent... and bored. AKA how to get radicalized and do worse than nothing in today's info environment. Did you live in China pre-90s, i.e. when actual "iron rice bowl" positions existed? They were above-median-income state jobs with good benefits (relative to incomes at the time), but covered only about 1/6th of the 600m workforce. Closer to all the random gov jobs being created on the taxpayer's dime.

            • bumby3d

              Much more succinct description than mine :-)

      • ep1033d

        Oh, that's easy. I choose #3

        3: The top of the economic ladder reacts to worsening conditions by switching from promoting riskier growth-based economic policies, from which they might glean opportunities for new wealth, to allowing only status-quo-reaffirming policies in an attempt to protect their already aggregated wealth and positions of economic power.

        This leads to worsening economic and social conditions for all non-elite parts of the population, as their economic and social issues are left unaddressed and thereby worsen under the status-quo.

        This leads to more pressure for the body politic to act to solve those problems, but as the top of the economic ladder now only endorses policies that support the status quo, a new political movement will need to grow that focuses only on stagnating and blocking any attempt for the government to act. All proposed reforms that allow the government to act for the benefit of the masses are blocked, all existing abilities of the government to help the masses are hollowed.

        This worsens economic and social conditions, which then means this becomes a catch-22.

        The political entities that are responsible for this dynamic, out of quiet guilt, instead begin lauding themselves that their actions are not the cause of worsening conditions for the masses, but are instead the realization of the ideological soul of their nation. This leaves these individuals intentionally deaf to the possibility that they might be wrong, and further reinforces their inability to compromise or present solutions that would challenge the current status-quo. This also becomes a catch-22, and can reach the point of fetishization. It also increases the tendency of this political class to promote policies that are ideologically driven political projects that hurt the country economically, because they fundamentally arise from a position that large parts of the population are not true citizens and deserve punishment.

        If this trend continues unchecked, as things continue to worsen, merely rendering government unresponsive becomes insufficient. So it results in the election of strong-men type characters that sell themselves to the populace as being the only ones capable of breaking the political deadlock that prevents solutions to issues within the country, while simultaneously promising to their richest funders that they will actually use their power to further entrench the positions and wealth of the existing economic elite.

        This creates a new group of politicians that seek power for themselves by exploiting this new dynamic, where they compete by their willingness to break cultural norms in order to service these two groups, and much later, a reactionary progressive cadre that functions similarly on the opposing side.

        This new dynamic worsens the economic and social issues at play, as the crumbling competency of hollowed out social norms and institutions lose their ability to function as effectively for the mass populace until the standard of living has fallen sufficiently to match the capabilities of the country to support it, and an increasingly deaf and authoritarian political class is far less effective at managing the needs of the people than in the previous more-decentralized state. Quality of life continues to drop, either slowly over time, or violently.

        At this point, the new political class will consider usurping power and wealth from the rich elite, as there are no longer legal norms to constrain them.

        Regardless, with decreased capacity in an ill functioning state, where the new political elite has acquired power by servicing an elite class focused on maintaining wealth instead of driving growth in the nation, the country loses its ability to compete economically with rival nations, and begins falling further and further behind, thereby worsening the above cycle.

        But what you have to remember, is its all worth it, because the alternative includes potential taxes, promoting policies to raise wages, or reinvesting in your fellow countrymen, and those things are all worse than what I just described /s

    • thucydides3d

      What do you mean when you say the economic pie continues to shrink?

      Since 1960 American GDP per capita has more than tripled in real terms (constant dollars): https://fred.stlouisfed.org/series/NYGDPPCAPKDUSA

    • vishnugupta3d

      To add, the four-year degree is a very recent phenomenon. For most people it was an exception. The norm was to become an apprentice after or during high school and then go on to become a master tradesman. We might just be seeing a reversion to the norm.

    • amanaplanacanal3d

      The whole AI thing is just a symptom, I think. The real causes are:

      1. The boomer political class that is perfectly fine with spending more and more money but unwilling to pay for it, mortgaging everything the previous generations built. Basically sucking all the value out of it for themselves so that their children and grandchildren will have nothing.

      2. A big symptom is that tax rates on the richest have been going down for the last 50 years, and are now ridiculously low.

      3. The thing that makes that work: unnaturally low interest rates, for the last several decades.

      4. A complete unwillingness to use antitrust to break up the largest companies.

      I don't see how this is going to turn around, because the propaganda is now so good they have convinced the poor that all of this is somehow good for them.

    • libertine3d

      > It appears to me that our socio-economic model simply doesn't scale with technology. We need to have a constructive conversation about how to adapt.

      This doesn't add up with the number of billionaires and the growth over the last decades - it just seems that there's a bottleneck and the value captured from technology isn't trickling down.

      Of course, you can claim that such "billionaires" and part of that "value" are the result of speculation, but since billionaires can use that speculation to borrow money and acquire more assets, then for sure there's some value there that isn't getting to the vast majority of the people.

      • SR2Z3d

        My 2c is that most normal people are incapable of capturing this kind of value simply because they aren't educated enough on how to use a computer to do these things.

        ChatGPT is the first time in a while that a new technology is easy enough to use and integrate that the productivity gains could be captured by the lowest-skilled workers.

        Uber/Lyft are the antithesis of this: their apps do very simple things but at scale and with an eye towards confusing their drivers. It's not unthinkable that a co-op version of these apps could exist, but the folks with the skills to build them get snapped up by big tech pretty quick.

        IMO if we want to share the wealth, the ONLY way to do it is by upskilling workers and simplifying technology.

        • criddell3d

          > a co-op version of these apps could exist

          There are a bunch of white-label ride share apps. When Austin banned Uber and Lyft in 2016 it took about a day before new services popped up to replace them.

        • sssilver3d

          The problem with building Uber and Lyft isn’t the process of writing the software — that’s been done many times over.

          The hard part of course is investing the money in marketing and support to match the user recognition and experience that Uber and Lyft, subsidized by VC wallets, can provide to their customers.

        • libertine3d

          One of the things you have to take into account is that a lot of these tech companies grew in the gaps of regulation, while legacy companies were forced to abide by standards and regulations; a lot of tech companies broke the markets and thrived on infrastructure paid for by taxpayers.

          Uber, Airbnb are good examples of this. When regulation caught up with them, the damage was done for better or worse. To this day, some regulators still struggle to keep up due to the lack of resources, and the tech does achieve this... and now you have tech oligarchs dismantling the government bodies responsible for regulation.

          So I don't think it's just a matter of people upskilling to generate more value, because people pay taxes for the work they do - they can't afford the lawyers and structures to avoid this. The people who have the luck and the opportunities to generate this kind of value are already doing it, or will do it, or don't want to do it.

          So while I agree with you, and people should have access to the opportunities to upskill and to generate value, I also think there's a massive gap in income and taxes in a lot of democratic countries. These things can go, and should go, together.

  • bryanlarsen4d

    At least for the moment, AI still needs knowledge workers to spec and prompt and check. AI makes knowledge workers more productive, but it doesn't eliminate the need for them.

    And if knowledge workers are more productive, then knowledge work is cheaper. Cheaper knowledge work increases demand for knowledge work. So the number of workers required might actually increase. It also might not, but first order analysis that assumes decreased knowledge workers is not sufficient.

    Cf. garment makers. Partial automation of clothes making made clothes cheaper, so now people have closets full of hundreds of garments rather than the 2 sets our great-grandparents likely had. There are more people making garments now than there were 100 years ago.
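
    A toy way to see when the "might actually increase / might not" point above cuts one way or the other, assuming constant-elasticity demand; the numbers are made up for illustration, not a claim about actual labor markets:

      # If productivity doubles, unit cost halves; total hours demanded rise only
      # when demand responds strongly enough to the lower price. Numbers illustrative.
      def hours_needed(productivity_gain, elasticity, base_hours=100.0):
          price_ratio = 1.0 / productivity_gain           # output gets cheaper
          quantity_ratio = price_ratio ** (-elasticity)   # demand responds
          return base_hours * quantity_ratio / productivity_gain

      print(hours_needed(2.0, 1.5))  # elastic demand: ~141 hours, MORE workers needed
      print(hours_needed(2.0, 0.5))  # inelastic demand: ~71 hours, FEWER workers needed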

    • gopalv4d

      > Cheaper knowledge work increases demand for knowledge work.

      This is Jevon's paradox.

      > So the number of workers required might actually increase.

      The increased demand for work turning into new jobs for existing workers, that is where the question is more complex.

      This has gone the other way too in matters of muscle - people who wouldn't have been employed before can now be hired to do an existing task.

      When you go from pulling shopping carts to an electrical machine that pulls carts for you, now you can hire a 60 year old to pull carts in the parking lot where previously that job would be filled by teens.

      This is all a toss-up right now.

      In an ideal world, I will be paying less for the same amount of knowledge work in the future, but as a worker I might get paid more for the same hours I spend at work.

      My hours are limited, but my output is less limited than before.

      • fabfoe4d

        It’s actually the Jevons paradox, non possessive, named for William Stanley Jevons. I thought it was possessive too because many people write it that way.

        • hombre_fatal3d

          I don't see why it can't be written possessively since he stipulated it. It makes more sense.

          You would write it Jevons' paradox. Six sources on its wiki page write it as such in their title.

    • jillesvangurp3d

      There are several dynamics here.

      1) AIs enable knowledge workers to be more efficient. They don't replace them. But they'll get more done. So, you might need fewer of them.

      2) This frees people up to do different, more valuable things. There's a scarcity of people on the job market. We have record employment, not unemployment. In short, freeing people up to do valuable things that need doing is a good thing.

      3) A lot of the economy is the service economy. It's not about producing goods, or providing essential things like food, health care, etc. Instead it is about providing services to people at some value. The reason the economy has transitioned to that is the industrial revolution. A few hundred years ago, most of the economy was about scraping together enough food for people so they wouldn't starve. That's a solved problem. Farmers use machines instead of dozens of employees. Some of those machines are autonomous.

      4) Economies are about value chains and upcycling low value resources to high value services and goods. AIs make certain things cheaper which just frees up spending on more valuable things. What's valuable is determined by people and what they value.

      5) If you fire all the people and replace them with AIs (the dystopian view of AI), those people no longer make money. So their spending behavior changes. The economy changes and adapts. Spending just shifts to things we value. Like maybe a human touch.

      6) You could argue that much of the economy is already bullshit jobs. Who needs managers, marketing experts, social media influencers, and all the other fluffy jobs that we invent? Somebody values that. That's why that stuff exists.

      • johnnyanmac3d

        2) and 5) assume people let go or fired have ways to make money afterwards. That's the big question mark no one is really considering right now. People can't spend money they don't have.

        I think people talk excitedly about this post-labor society without considering how we upkeep it while all the valuable labor is managed by billionaires and worked on by an exceedingly small labor force maintaining the real labor force of AI. Current directions don't support utopian ideals like UBI.

        • jillesvangurp3d

          I don't subscribe to this narrow view of economics. The AI fallacy is that whatever they do cheaply just stops being valuable. We pay a little for it, but we spend most of what we have on other things. What that is shifts over time. Pre-industrial revolution it was food and agriculture. Then it shifted to manufacturing. Today it's people doing stuff for other people (aka. services). Over time, we work less and less and we earn more and more.

          You mention UBI. You could actually argue that we already have some notion of that. It's just a terribly inefficient, poorly administered, and very costly, and not very good version of it. People don't starve, they mostly have access to health care (the US being an exception to this relative to most countries, including most developing nations). And shelter too (people freezing to death because they can't hide from the elements is pretty rare). Many people live on what they get for free. It's called charity, social security, unemployment, pension, childhood, etc. But one way or another, somebody provides for them. And being dependent on just the free stuff is something that would horrify most normal people. But it's there for pretty much everyone.

          Feudalism is cyclic. It comes and goes. We had a lot of it early last century. And then we got communism, socialism, unions, etc. And a surge in economic wealth for the middle class after things were rebalanced. The post-WWII economic boom in the US was powered by Roosevelt's New Deal. I don't think there's much respect or loyalty to the current batch of trillionaires. They can exist only because people allow them to. Future governments could find themselves empowered and tasked to do something about the economic wealth distribution. Such things have happened before. Often after some kind of revolution.

        • cma3d

          Progressive taxes

    • bitxbitxbitcoin4d

      I wonder how the ratio of people making garments relative to the total world population has changed though in this example.

      • bryanlarsen4d

        No easy answer since most garments > 100 years ago were home-made. But I can confidently assert without data that the number of man-hours of labor in the average closet is substantially up.

        garment makers chosen because of this recent discussion: https://news.ycombinator.com/item?id=43450515

        • MITSardine4d

          How can you be so sure?

          I found this document: https://web.archive.org/web/20210126040017/https://ribevikin... It asks the question "How long would it take to make a Viborg shirt?". The answer seems to be 354 hours per their experiments. This is from seed to shirt. (linen)

          I'd be surprised if we had that many man-hours, let alone 3 or 4 times that (this is a single piece of clothing), in our wardrobes. Conservatively assuming a man-hour in the wardrobe costs us $5 (while people are often paid less, their salaries are also but one expense), you'd need at least around $1500 to equal just that shirt.

          • bryanlarsen4d

            I'm talking 100 years ago, not 1000. We've had mechanized fabric production for > 300 years.

            From your document, weaving and spinning are > 85% of the labor in your shirt. Those would be almost 0% for a shirt made 100 years ago. And those wouldn't be the only steps mechanized 100 years ago.

        • watwut4d

          100 years ago is 1925 - people were already buying cloth in stores at that time. You need to go further into history.

      • bobthepanda4d

        the more obvious recent example is that we employ more bank tellers than we did before the ATM, because the ATM reduced the amount of labor hours needed to operate a bank branch and made a lot of marginal bank branches pencil out.

        Only the recent trend of online banking services is actually turning this around.

    • MoonGhost4d

      According to this, cheap food is good because people just start eating more. Actually, cheap imports can be really bad for local businesses.

      Now imagine dystopian world where AI can solve most data / engineering / science problems for cheap. Or even for free, just ask.

      • delusional4d

        AI working isn't the dystopian option. Imagine AI can do none of those things, but the people who control the capital believe they can.

        That's the dystopia.

        • bluefirebrand4d

          > Imagine AI can do none of those things, but the people who control the capital believe they can.

          I don't need to imagine, that's the reality we live in right now

          • MoonGhost4d

            Recall how it was just 4 years back. Since then there has been more progress in AI than in the previous 40 years, which in turn saw more than the previous 400. It's accelerating and unstoppable. It will be a very different world 10 years from now. I hope I live that long...

            • johnnyanmac3d

              Idk if I'd call it "progress" at this point. There were a few big steps and then companies poured billions to shove it in our face. And from what I hear we hit a plateau with current approaches.

              I don't think AI is really following Moore's law here.

              • MoonGhost3d

                > I hear we hit a plateau with current approaches

                Then we need new ones. In my opinion, the important step last year was distillation going mainstream. Now using old models to train new ones is normal, or even necessary. That was done before, but it was sort of experimental. Creating targeted datasets is a very powerful thing. Now big models can be trained on quality data instead of a random mix from the internet. This includes long-thinking and tool-use examples from the beginning, and not just as fine-tuning.
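
                As a rough illustration of what training on an old model's outputs looks like (soft-label distillation), a minimal numpy sketch; the temperature and toy logits are assumptions for illustration only:

                  import numpy as np

                  def softmax(logits, temperature=1.0):
                      z = logits / temperature
                      z = z - z.max(axis=-1, keepdims=True)   # numerical stability
                      e = np.exp(z)
                      return e / e.sum(axis=-1, keepdims=True)

                  def distillation_loss(student_logits, teacher_logits, temperature=2.0):
                      # Cross-entropy between the teacher's softened distribution and the
                      # student's: the new model learns to imitate the old model's outputs.
                      teacher_probs = softmax(teacher_logits, temperature)
                      student_log_probs = np.log(softmax(student_logits, temperature))
                      return -(teacher_probs * student_log_probs).sum(axis=-1).mean()

                  # Toy example: the teacher is confident about token 2, the student is not yet.
                  teacher = np.array([[1.0, 2.0, 8.0]])
                  student = np.array([[2.0, 2.0, 2.0]])
                  print(distillation_loss(student, teacher))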

                Another way of thinking about it is that AI is steadily getting closer to human-level IQ. Not just approaching it: it will cross the line. That distance has shrunk dramatically in the last few years. Then comes the singularity that everybody was talking about for so long.

                This year we already have Google's multimodal robotics model. It's closed, but will likely be reproduced. A significant step toward useful generic robots.

                • johnnyanmac3d

                  We do. But we're not in an innovation environment anymore. It's pump and churn what we're currently doing and hope we brute force "intelligence". Ironic situation.

                  There are lots of promises out there but not much action nor real-world appeal to this stuff. That's pretty much textbook grifting as of now.

        • Enginerrrd4d

          Yeah and not unlikely.

          There are trends in both directions in manufacturing. On the one hand, we had amazing fountain pens with forged gold nibs hand-tuned by a nibmeister that can do things you can't replicate now. The trend was toward really crappy pens that could be mass-manufactured and required less skill by the consumer to operate, but are demonstrably inferior. People chose to buy the cheaper, barely passable option more often by orders of magnitude.

          On the other hand, we have automotive manufacturing. Modern cars are more reliable, safer, more comfortable, more performant, more featured, capable, (and expensive?) than ever.

          With knowledge work, it's clear that certain things will get "optimized" into dysfunction. (Look at how much agency corporations have taken away from the people operating the public interfaces to the company.) Be it help/support centers, resolving unusual issues, or dealing with automated bans/account deactivations.

          It's hard to say what direction it will go...

      • zombiwoof4d

        Free after it took a trillion in energy to train

        • mystified50164d

          How much time, energy, and resources went into developing your language of choice? How much datacenter energy is burned recompiling GCC every night for the last 30 years?

          • johnnyanmac3d

            Our brains are pretty efficient, so orders of magnitude less, yes.

            I'm also sure you can take average personal computing consumption and multiply it by the number of PCs to approximate your compilation example. I'd wager it pales compared to what the last year of crypto mining alone has done.

          • Chaosvex4d

            Comparatively little, I'd imagine.

    • timewizard4d

      > There are now more people making garments now than there was 100 years ago.

      Exponential population increase only began 75 years ago. This is correct but the analysis is wrong.

    • downrightmike4d

      Current AI is just a word calculator; it hallucinates because it doesn't know what should come next: a 5 is just a 5 to it. LLMs will never be general intelligence.

  • tangotaylor4d

    > A recent academic paper found that software developers who used an A.I. coding assistant improved a key measure of productivity by more than 25 percent, and that the productivity gains appeared to be largest among the least experienced developers.

    I dunno about this citation. I just read the paper and it considers "productivity" in this context to be the number of builds, commits, and pull requests in a developer week. Interestingly, there was no statistically significant difference in build success rate between those who used the AI tool (Copilot in this case) and those who didn't.

    https://papers.ssrn.com/sol3/papers.cfm?abstract_id=4945566

    • Derbasti4d

      Personally, I've seen junior developers most negatively impacted by AI. It seems to stymie learning and reasoning skills.

      • globular-toast4d

        In my experience, the best developers out there have all worked through hard problems by themselves. "Been through the trenches", so to speak. That means getting right to the bottom of difficult bugs etc., no matter how long it takes.

        On the other hand, the worst developers reach out for help the moment things start to look unfamiliar to them. Over the years I've tried to encourage such developers to think for themselves, but I've come to realise some just can't do it if they know someone else on the team probably has the answers.

        Availability of LLMs seems to be the worst thing for this kind of developer. Now they don't even have any kind of social barrier to reaching out for help. I just don't see how they're ever going to learn to do anything.

        Code monkeys have always existed, and can be useful, but why wouldn't the engineers just go straight to the LLM instead of going through code monkeys?

      • fransje264d

        There is a recent paper by Microsoft that describes exactly that. (Although they didn't restrict the negative impact to junior developers)

        https://www.microsoft.com/en-us/research/wp-content/uploads/...

      • toxik4d

        There is definitely an effect of ”you forget what you don’t need to remember” happening even to me.

  • Glyptodon4d

    Yes, but I'm not sure it's entirely related to AI. I think one of the real lessons of the pandemic was that you can outsource everything knowledge-related, because paying peanuts for something that's somewhere from 1/8 to 1/2 as good is better than not paying peanuts. And this effect is deeply multiplicative the less skilled and more entry-level the role is, a factor AI only exacerbates. The precursor to this was tech job postings, year by year, mostly only being for, first, 3+ years of experience, and then 5+, and so on.

    That said, AI, as it improves, will only exacerbate this. At least until it becomes so multiplicative that having an AI + pilot is so productive that you'd never not be happy to hire another one.

  • tootie4d

    Everybody is commenting about AI, but that's barely mentioned in the article. In particular there is this quote:

    > The gap in wages between those with a college degree and those without one grew steadily beginning in 1980, but flattened during the past 15 years, though it remains high.

    I suspect there's a lot of factors at play, and the expansion of access to college is part of it. The value of a college education has been somewhat diluted. There is the end of the zero-interest era. There are massive layoffs of federal workers, which skew heavily towards knowledge work. And some of it is what Starbucks cites as "removing layers and duplication and creating smaller, more nimble teams", which is a pretty normal cyclical thing where an era of expansion and promotions makes your org too top-heavy and bureaucratic, so you trim aggressively and probably end up rehiring some as needed. Like a controlled burn of an overgrown forest. And if some of those are permanently lost to AI then that's a bonus.

    I think a lot of business leaders are just expecting a recession and want to be on good footing before it hits.

    • bjornsing3d

      Personally I think it’s because a university degree used to be a proxy for cognitive ability and conscientiousness. Now that many more people have them that’s less true. The market is starting to realize and we’ll see some “corrections” going forward.

    • KurSix4d

      Companies got fat during the boom, especially post-2020, and now they're shedding layers to get leaner and more agile. It doesn't mean those jobs are obsolete forever, just that the pendulum is swinging the other way for now.

    • milesrout4d

      >Everybody is commenting about AI, but that's barely mentioned in the article.

      As usual, they didn't read the article. They read the headline, assumed it was about AI, and wrote the same boring "management bad, AI bad, the present bad, programmers good, the past good" rants.

  • jongjong4d

    Unfortunately, knowledge and intelligence have been losing a lot of value over the past couple of decades.

    Aside from tech making information more accessible, centralization and monopolization likely played the biggest part, driven by the design of our monetary system. Most new money enters the economy backed by endless streams of real estate debt (mortgages), public debt and corporate debt... All these well-moneyed areas (real estate, government, corporations) have become monopolized and highly political. There's just no room for real knowledge work. It's all about status games and BS internal politics. People who benefit from Cantillon effects have no interest in bringing nerds into their organizations to compete against them for their spot in front of their easy money printer.

    Insiders are mostly hiring people who are dumber than themselves; people whom they can control.

  • nopelynopington4d

    I flip flop daily on whether it has or not. Even the best AI engines write truly awful code, and it might not improve. But it also makes it easier for people to coast, and turn in half assed work, which is certainly a pathway to the decline of knowledge work

    • nzach4d

      > it also makes it easier for people to coast, and turn in half assed work, which is certainly a pathway to the decline of knowledge work

      I understand your sentiment and I partially agree with it. But this kind of phrasing implies that "doing the bare minimum" (to put it in another way) is a strictly bad thing.

      Sure, it's easy to condemn someone "half-assing" a job by labeling him as lazy or something like that. But the reality is that most of the time we don't need the best, nor are we willing to pay properly for this effort.

      Imagine your baker, for example. Do you really need 100% of his effort and care to be put into every single bread he makes? For me this answer is "no". All I care is that he comply with all regulations and that his bread tastes good, I don't really mind if it's not best bread in the world. And even if it was the best I probably would find it too expensive to buy in a daily basis.

      Another example would be blacksmiths: at some point they were our only option for making something out of metal, and they would put quite a lot of care and attention into every piece they made. But at some point we created machines that can make things out of metal. These machines, at first, weren't really good and the products they made were of inferior quality. But they had enough quality to be useful, were cheaper, and were able to produce immense quantities of goods.

      What I'm trying to say is that sometimes the "low effort" option is the correct choice. And I don't think this means the decline of knowledge work, this just means we will see a change in what is considered "relevant skills" for knowledge work.

      • nopelynopington2d

        > Imagine your baker, for example. Do you really need 100% of his effort and care to be put into every single bread he makes?

        But the analogy here is that if all bakers started using bread machines every day, and new bakers only learned how to ask the bread machine to make bread, the decline of baking would surely be a step closer.

        And sure we can quibble over tools the baker uses such as ovens or dough mixers or what have you, but ultimately they must know how to make bread. AI platforms attempt to remove the need to understand the code, so that people don't need to learn how it works to make it.

      • yoyohello134d

        I guess the thing is I don’t want to do the “bare minimum”. I want what I do in life to mean something. I want to work hard and care about everything I do, whether that’s family or work or leisure. Coasting and doing the bare minimum is not a good way to live. Society is pushing people to spend more and more of their life on meaningless slop, then wonders why there is a mental health crisis.

        • spencerflem4d

          Totally feel this.

          It sucks, because so few things in tech _are_ meaningful, and exist for a reason other than to enrich whoever owns the company making it.

          • hnthrow903487653d

            Earnings for contributing to human knowledge also need to be higher, which is why lots of people don't choose those paths and have to commit to grinding corporate ladders.

          • tucnak3d

            Stop worrying about rich people

      • milesrout4d

        >But this kind of phrasing implies that "doing the bare minimum" (to put it in another way) is a strictly bad thing.

        It is. It shows a lack of character. Have some pride in your work. Have some pride in yourself. Being lazy is pathetic.

        >Sure, its easy to condemn someone "half-assing" a job by labeling him as lazy or something like that. But the reality is that most of the time we don't need the best nor we are willing to pay properly for this effort.

        There is no such thing as "need". You don't need anything. People lived for thousands of years on a diet of mostly grains living in uninsulated houses with open fires. Everything is a want. People's wants are never satisfied, you can always want more.

        But even if nobody else will appreciate it, you should do the right thing anyway. You should do it because you take pride in your work.

        >Imagine your baker, for example. Do you really need 100% of his effort and care to be put into every single bread he makes? For me this answer is "no".

        Nothing to do with what I want. He needs to put in the effort. He needs to do it for himself.

        >These machines, at first, weren't really good and the products they made were of inferior quality. But they had enough quality to be useful, were cheaper and were able to produce immense quantities of goods.

        The men that designed the machine, or ran the machine, or made the moulds, or assembled the parts, all of them worked bloody hard and were proud to do so for their families and for themselves. Work ethic was prized. This wasn't low effort. It was a different tradeoff between material inputs and material outputs, but it required no lesser work ethic. It didn't permit laziness or idleness.

        • callc3d

          There is nuance to the two separate ideas of "doing the bare minimum" and "taking pride in your work". I agree with you that taking pride in your work is important; I wish it were more highly valued nowadays, honestly.

          The flip side is that "taking pride in your work" nearly always results in being taken advantage of by your employer, at least in salaried positions. And if you can spot the social patterns and games played, such as valuing employees who work overtime (without pay), on weekends, etc., it is clear that employers love getting more value from employees without compensating them. Work extra hard for 6 months to maybe possibly get a promotion? People are generally waking up to this reality, hence the 'quiet quitting' mindset.

          One can both take pride in their work, and respect their time by adhering to their employer-employee contract as written.

          Lastly, in the baker example, they have a direct reason to put in their best effort (assuming the baker owns their bakery): they will gain goodwill and repeat customers if they bake very well. A salaried worker is far removed from being directly compensated for their work. I predict the situation would be very different if salaried work earned commission based on sales, plus overtime pay.

      • johnnyanmac3d

        >All I care is that he comply with all regulations and that his bread tastes good, I don't really mind if it's not best bread in the world.

        That's part of the issue. They ignore regulations and the bread has mold. But we eat it and say "well I'm not dead". Because we're being conditioned to eat, not taste. To consume, not question.

        Meanwhile, I complain the bread tastes stale and moldy and I get argued down by fake bakers that "no, you don't understand, this is the future of bread". Well, it sucks. I don't care how much you're paid to say otherwise or promise that it'll taste "good" (read: not crap) in a few years. I'll go to my bakery until then instead of having your bread shoved down my throat.

        Make it taste like bread first instead of hyping up how it looks so close to bread. That's the whole issue causing the downfall of society.

        • ratorx3d

          There’s also the case that the regulations don’t exist.

          And what’s more worrying is things where the negative impact is higher order.

          If the bread has some poison that will kill you in 5 years time etc.

          Currently we maintain a bar partially with human ethics and processes, whether that is directly preventing bad outcomes because of liabilities or reflecting on bad outcomes once they happen to improve regulations (a lot of which relies on introspectability).

          Once AI starts replacing the decision-making layer, we lose the collective understanding of how processes fail. Once you start needing to constrain the space of machine error, you’ve basically arrived at almost solving the problem again.

          • johnnyanmac3d

            Yes, I do appreciate my FDA making sure any properly rated eatery isn't potentially serving poison. Another big issue as of late to worry about: a government that decides shilling crypto and EVs (which he ended the tax credit for... oh, and the tariffs!) is more important than simply keeping regulatory bodies operating.

      • singleshot_4d

        > Do you really need 100% of his effort and care to be put into every single bread he makes?

        Nope; just mine.

    • kccqzy4d

      You are describing quiet quitting, which is a reaction to the period of overwork and burnout during COVID.

      • milesrout4d

        "Quiet quitting" is just laziness from people that don't take any pride in their work. It was not a reaction to "overwork" or "burnout". Laziness has always existed. This was just a new name for it.

        • johnnyanmac3d

          Quiet quitting is realizing that you want to move on in your career, but every interview process takes 2+ months for bureaucratic reasons instead of actually judging your ability to do the job.

          Quiet quitting is having no safety net when you're no longer satisfied with your job for any reason, but your healthcare is tied to your employer as ransom.

          Quiet quitting is realizing you are going to be terminated in the next wave of layoffs 3-6 months later and that your efforts will not save your job anyway, so focus on jumping ship.

          Greed has always existed; this was just a new name companies tried to gaslight with. I don't want to hear about "lazy workers" at a time when layoffs (that aren't performance based) are only increasing, in a society that decides it's lawful to treat an employment contract as a toy to be thrown away at any whim.

  • carabiner4d

    If you look at fields like mechanical engineering, no, it hasn't, because there isn't much training data for that type of work available on the internet. (CAD isn't engineering.) It's locked up in corporations like Ford, SpaceX, Toyota. There isn't open source mechanical analysis work available at a professional level, which might be an FEA output that references 5 messy spreadsheets that gets written up in a 20 page report with charts, and that references 20 other internal docs, specs. And every company does analysis differently so I'm not sure you could adequately train a model that can generalize it well.

    • mywittyname4d

      I think this is a flawed outlook.

      My wife works in healthcare and they just announced that AI would generate 50% of their documents. According to my wife, in every review she's done of the work there are critical errors in logic or reasoning (hallucinations), but leadership doesn't care because they are "grading" the AI on things like grammar and spelling. So, yay, it spelled "fusospirochetosis" correctly, but unfortunately it attributed that to a 23-year-old woman who came in for a broken arm.

      I think we need to be prepared for "vibe coding" taking over every industry. Leadership is going to blindly implement broken AI to replace workers.

      Now, you might think that threat of lawsuits, regulations, and ethics standards will save your industry. And maybe that's true, but I personally don't think so. I personally think we are going to see country-scale enshitification of everything as lazy people use AI to generate everything which is then reviewed by lazy people who also use AI to review everything and no one notices or cares until it's too late (and maybe not even then).

      I hope you're right, but I'm not convinced.

      • pianoben4d

        "Medical (vibe) Coding"

        Oh lord, that is a phrase I did not need to think about today

        • Karsteski4d

          I am quite literally sleeping in a hospital while I read that. Horrifying.

      • carabiner4d

        Profit and a competitive market are powerful incentives that keep people from dying in the human-transport industry, incentives that don't exist as much in hospitals. Also, if you think the "threat of lawsuits, regulations, and ethics standards" doesn't encourage safe cars or airplanes currently, what do you think does? In engineering there are currently 1,000 ways to cheat (and sometimes they do, like Volkswagen's dieselgate), but reports are actually checked rigorously multiple times by other engineers at a company and by federal regulators (who are also engineers). Things get rejected and changed based on these reviews. Why would AI engineering make this different?

        If AI could cause a fiasco on the level of Toyota's unintended acceleration debacle from 2008, which caused the largest auto recall in history, that is something that would give pause to using AI, no? If AI leads to bad products that kill people, companies will lose money and they will not want to use AI.

      • mrweasel4d

        > I personally think we are going to see country-scale enshitification of everything as lazy people use AI to generate everything which is then reviewed by lazy people who also use AI to review everything and no one notices or cares until it's too late

        Sounds about right to me. A large quantity of documentation is written solely to be written, not read. If it's your job to write reports, articles, memos, whatever, and you deep down know that this isn't being read, then why not have the AI do it?

        I think that we're producing way too many documents/reports/content in general and we need to slow down. Humans can't keep up, neither on the production side nor the consumption side, so we employ computers to "help", but it's all busy work.

        • eulers_secret4d

          This reminds me of my first internship, I was running a test suite and filling out a web form every week. It was never mentioned in any meetings or other comms.

          After about 3 months of doing this, I asked my manager why I was doing this if no-one cared or noticed. He told me to stop and see if anyone said something. I stopped, interned another 15 months and it wasn't ever mentioned.

          Hell of a lesson to learn as a newbie

          • johnnyanmac3d

            3 months later a showstopper occurs and heads are rolling. The poor release manager wants to pinpoint when the breaking change was introduced, but the last test suite run was almost 2 years ago. They didn't care until it was too late. As fitting for American culture.

            Not saying you were wrong to stop, but multiple people dropped the ball there if that's how they treat their testing.

        • johnnyanmac3d

          >documentation is written solely to be written, not read. If it's your job to write reports, articles, memos, whatever, and you deep down know that this isn't being read, then why not have the AI do it?

          Medical records seem to be the opposite of this. Nurses document to report to doctors so they can diagnose and advise. That seems like the worst step to cut corners on. Accountants document reports for patients, for prescriptions and diagnoses that can affect their lives. Heck, as a CYA you document records in case a law firm comes in demanding papers to read. Do they think lawyers don't care about this stuff?

          Just because not everyone is going to pore over every word doesn't mean no one will.

        • mmcconnell16184d

          Idiocracy in real life. The AI told me to drink Brawndo. Why? Because it has electrolytes. My AI said electrolytes are good.

      • thechao4d

        Idiocracy was supposed to be funny, not terrifying.

      • johnnyanmac3d

        Why is it that lazy people are kicking out the motivated? I thought you had to be motivated to move up in the world? Or to understand and interpret law? Or to not take complete tripe from a 20-year-old MBA who just sees numbers instead of lives?

      • BlarfMcFlarf4d

        Certainly you won’t be saved in the US, where things like consumer protections are quickly becoming a thing of the past. And the US has traditionally exported its business practices to the world.

  • 7qW24A4d

    Corporations that over-hired over the past 10 years needed an excuse to cut the layers of fat and bureaucracy out, and AI came along at just the right time. It doesn’t matter if AI is increasing productivity; what matters is that people think it might be.

    • bee_rider4d

      Maybe…

      I dunno. Anyone who’s been at a big company (the kind that are inclined to over-hire) can attest to the massive population of folks who don’t seem to have any productive job. Does AI spell doom for those people? I have no idea; their job wasn’t to be productive in the first place. Being a warm body to expand the size of some middle manager’s fiefdom—that is not a job that can be taken by AI, right?

      I guess maybe just engineering jobs are on the chopping block. Doomed, by the nature of their productivity actually being tangible, to be replaced by an AI that does their job worse.

      • mrweasel4d

        > Being a warm body to expand the size of some middle manager’s fiefdom—that is not a job that can be taken by AI, right?

        If you over-hired and now need to lay off people because funding and cheap loans aren't available, then AI provides a convenient excuse. You don't have to admit that you paid people to do nothing/very little for the last ten years.

    • mywittyname4d

      Maybe in '23 & '24 that argument made sense. But presumably the past two years they've spent laying everyone off have been enough to trim the fat.

      Honestly, any company that's still laying off people that they "overhired" during the pandemic needs to take a hard look at their leadership teams, because they've had ample opportunity to be rid of such people.

      • johnnyanmac3d

        I don't think it's that complicated to begin with. Free money, focus on growth. No free money and a looming recession, cut cut cut. AI is a convenient buzzword to keep shareholders happy, no matter how much you actually invest in it.

        I don't think it was a grand conspiracy. Executives are just panicking and don't have much more of a clue than the workers. It's just too bad they could weather the storm regardless while people relying on paychecks suffer consequences.

    • riku_iki4d

      The problem is that overhired personnel created lots of extra process/org/code/infra complexity, so firing X% of people may produce significant damage because the rest won't be able to handle that extra complexity or won't have enough institutional knowledge.

    • timewizard4d

      Mergers and acquisitions have been off the charts during this time. No one over hired. They over acquired.

    • downrightmike4d

      This is just fluff for distracting from outsourcing

  • iambateman4d

    All knowledge work follows a distribution of skill…lots of people are in the middle of the curve when it comes to ability to do a varied set of tasks.

    If I quizzed someone in the middle of that curve on (1) basic keyboard shortcuts, (2) how to VLOOKUP in excel, (3) how to buy a domain, (4) how to make a presentation that truly looked good, (5) detect if an email is a phishing campaign, (6) make a secure password, (7) do research in their industry…

    You get my point…The skill of an average knowledge worker today is miles and miles away from elite levels.

    Knowledge work is not declining, we’re just watching the average ability of a typical worker go way up, over the course of a generation.

    • poincaredisk4d

      Was this person employed in IT? Did you ask them to quiz you in return? "How to buy a domain" is exactly the kind of thing an HNer would consider an obvious question. I wonder how many of us know when to plant carrots, know the chemical structure of benzene, or can point to Malawi on a map.

    • johnnyanmac3d

      >The skill of an average knowledge worker today is miles and miles away from elite levels.

      Wouldn't that by definition always be true? Average and "elite" are dynamic values. I'm unsure if that gap is increasing or not.

  • owenpalmer3d

    Jonathan Blow has been talking about this problem for a while now. I especially love his talk "Preventing the Collapse of Civilization"

    https://youtu.be/ZSRHeXYDLko?si=HgGG5PWyu7CBxYxt

  • coolThingsFirst4d

    Yes, it has for a very long time I just got the memo too late.

    I will focus on making small startups (I'll cast around for ideas since I'm not very creative) since getting a job is ALMOST impossible. I've got GitHub projects and a degree, and I don't expect anything crazy since COL is low in EE, but it feels like unless you are Jeff Dean nowadays it's auto-rejection.

    Very low number of callbacks. I do have a multi-year gap since graduation, which I know has an impact, but this market is INSANE and has been for years.

    For comparison, just with a HS degree while in uni, I had more callbacks from companies that were orders of magnitude better than the ones I hear from now. It's like my career prospects have wound down instead of progressing. Crazy times.

  • stuaxo4d

    Productivity keeps going up, now to the point where there is less and less work.

    Instead of paying people more and reducing hours, we let so much be taken by those at the top that it unbalanced the whole system.

    We should be working 3 day weeks on the same wages by now.

    A 4 day week would be a start; it would give parents enough time to do chores and have a weekend.

    • globular-toast3d

      Productivity is not a problem. We produce way more food than we need, way more cars than we need, way more plastic gadgets than we need etc.

      In fact, we produce so much that it's making people sick: obesity, diabetes, atrophy, cancer, loneliness, depression etc.

      At some point a long time ago productivity stopped being the thing that made us happy and started being the thing that made us sick.

      But our politicians never stopped using this crude measurement as a target. If there's one thing people love it's a number. Number goes up: good, number goes down: bad. Easy. But there's no number for happiness, so our political systems are incapable of targeting it.

      Now that the productivity problem is long behind us, we need to start working out how to climb out of the local optima we've dug ourselves into. We need to solve problems like cancer. This doesn't get done by people going to a building and turning a handle all day. It gets done by thinking.

      • erikerikson3d

        > Productivity is not a problem

        GP didn't claim it was

  • Animats4d

    Demand for college graduates peaked long ago. Only about half of US workers with college degrees have jobs which require them.

    • hn_throwaway_994d

      I wasn't aware of this statistic, so more details: https://archive.vn/e0k2M

    • kjkjadksj3d

      Most job postings say something like "a BS saves you 5 years of on-the-job experience", though. And there are not a lot of jobs out there that are truly entry level, zero experience or relevance required. Even programming: good luck getting a job today without some years already on the job or a degree. It's not 2005 anymore.

  • MrMcCall4d

    It's the natural result of having more and more of our population being complete morons. The truth never goes away, but societies can shift closer to or further away from being willing to learn and accept new concepts, attitudes, and behaviors.

    History is rife with examples: Galileo, St. Francis of Assisi, Boltzmann, Einstein, Dr. Martin Luther King, Jr., Eugene Parker, Bernie Sanders ... the list goes on and on.

    Most every tradition in Earth's various societies consists solely of excuses to keep doing the same stupid crap, for the same stupid reasons, with the same willfully ignorant disregard to better ways of doing things.

    Only compassion gives us the clear perspective on how we can best apply our energy to lift everyone's boat, without harm or misery to either ourselves or our blessed Earth.

    "Same as it ever was." --Talking Heads

  • phendrenad23d

    I wish that articles like this didn't have such a US-centric limited view of the world. It's possible that knowledge work is booming elsewhere in the world, yet this is not explored or considered.

  • zkmon3d

    "an M.I.T. economist who was a co-author of the paper, said in an interview that a software developer’s job could change over the longer term, so that the human coder became a kind of project manager overseeing multiple A.I. assistants"

    Really? An economist has the authority and understanding to say that? People are getting carried away by projecting the gains seen from AI assistants. They need to understand that they can't simply project some trends so linearly.

    The same people saw cars and aeroplanes and said that we are going to have flying cars very soon. Nice projection of the trends!

  • schnitzelstoat3d

    No, it's just that interest rates are higher.

    • underdeserver3d

      Underrated comment.

      My null hypothesis is that the era of low interest rates is over, fat is being trimmed, and of course the people most affected are going to be those at the bottom of the food chain: inexperienced new grads.

      Sucks to be a new grad (as it did in 2008-9), but in macro terms I'd say wait and check back in two years.

      • ericmcer3d

        I am inclined to agree, but I was saying a similar thing in 2023 when the job market started to tank. Now we are 2 years later and it is similar, maybe a little better but definitely not a return to pre-2022 levels. It might end up being a 5+ year slow period by the time interest rates come back down and things start heating up.

      • johnnyanmac3d

        How do they survive for 2 years without a job?

        • underdeserver3d

          Really sucks to be them. Try hard and compromise on pay.

          I also remember seeing some study that says that 2008 grads' pay never caught up, and that e.g. 2010 grads' pay is higher despite their having less experience. Sucks, but you play the cards you're dealt.

          • johnnyanmac3d

            Damn shame. Really hope they remember this inevitable resentment when it comes time to vote on policy. It seems the '08 grads did not.

            • underdeserver3d

              Vote on what policy? What vote do you think could have made a difference?

              • johnnyanmac3d

                Direct votes will vary from state to state. But as a recent example, CA voted against rent control... Guess who funded the campaign against that prop?

                I was talking more about indirect effects: simply voting in whatever rep sounded better without keeping track of the policies they focus on.

                • schnitzelstoat3d

                  I live in a city with rent control and it's a disaster.

                  It just caused a massive drop in supply and landlords moving to alternative stuff like temporary contracts to get around it. The latter could be legislated against but would probably just cause a further drop in supply.

                  Ultimately you can't cheat the market, if you want lower prices you either increase supply or decrease demand.

  • anonfordays3d

    Considering the crumbling infrastructure, high demand for trades, increasing deportations, etc. blue-collar workers will likely see their compensation increase while white-collar workers see theirs decrease. The next industry for SV to disrupt needs to be construction. White-collar workers will see a demand for software tools that increase the efficiency of the trades.

  • KurSix4d

    Still, I wouldn't say we're seeing the death of knowledge work, more like a redefinition. The bigger issue is whether our current education system and workforce training are keeping up.

  • squigglydonut4d

    If you want better than AI you must demand it as if life depended on it.

    • timewizard4d

      How about quality of life improvements for everyone on the planet first? I think life /actually/ depends on that.

      Goofy energy wasting language models take an extreme backseat; however, this is "private equity world" and the rules are just to rip everyone off regardless of long term outcomes.

      • milesrout4d

        What does this comment even mean? I understand all the individual words but in combination it is nonsensical.

        What are "quality of life improvements for everyone on the planet"? What does that even mean? Are you proposing some sort of communist system? Why, in response to an entirely unrelated comment on an unrelated topic?

        What the hell is "private equity world"?

  • karaterobot3d

    I don't see any clear examples in this article of people being laid off due to AI. I see one anonymous source who suspects his team was halved and that the higher-ups expect them to do twice the work using AI assistants. But, teams were slashed in half and expected to do more with less before AI, so I'm not sure that counts. I'm not naive enough to think AI companies wouldn't come after knowledge workers, but I'm waiting for confirmation that they are capable of doing that yet. If not, Betteridge's Law is confirmed in this case.

  • yuppii3d

    Could that also be because the money is fake now? It does not actually follow the value; it's created out of thin air (?)

  • glitchc3d

    Is this anything more than just supply and demand? As a society, we have shunned physical labour as an effective means to earn a living. In the education system, we insist that everyone graduate from high school, regardless of the attained level of achievement.

    As a result, we have a huge supply of young people who are college educated and have eschewed the trades because they believe that physical labour is below their standing. They're not wrong, since that's what they've been told their entire lives. They're all looking for white-collar work, but we simply don't have enough jobs for all of them.

  • ctrlp4d

    Many "white collar" jobs are bullshit makework for college grads. There is nothing intrinsically valuable in them but they are necessary to keep the population occupied. When the economics become favorable again, there will be another iteration of creating these jobs with subtle ways to make the "career" seem meaningful when in truth they are just made up sinecures.

    • tucnak2d

      Do you work in politics brother?

      • ctrlp2d

        No sir, I work in technology. Sometimes adjacent to government, but never politics itself. Why do you ask?

        • tucnak2d

          You reminded me of someone! :-)

  • alecco4d

    Musk showed 80% were useless hires and AI gave the rest of the CEOs the excuse.

    So now they are downsizing and moving teams to India. From Google to American Airlines. What could possibly go wrong.

    • disambiguation4d

      Musk showed that it takes a lot of people to build a bridge and only a few to maintain it.

      Why anyone continues to use it is another question entirely.

      • typewithrhythm4d

        Twitter was built long before its headcount reached its peak.

        It suffered through a fairly common problem where, due to its value and revenue, it tried to justify a huge headcount, usually because managers get paid based on headcount numbers. In reality it always should have had fewer, better-paid engineers, rather than management parasites.

  • talkingtab4d

    One could wonder if perhaps, as usual, we people have it all wrong about this new technology.

    Perhaps AI is the death of corporations. Corporations are designed to extract money by charging as much as possible while spending as little as possible. That seems fair to me, right? We do not expect a corporation to understand the concept of common good. What AI does is to make it appear as though people are not needed, you can just replace them all with AI & machines. Profits go up, yes? Corporations make more and more money. What could go wrong.

    The first thing that is going wrong - currently - is that if you do not pay people enough money to buy things, then guess what happens to your business? Goodbye Apple. Goodbye Google. Goodbye moon. Oops, sorry. This is already happening. If the wolves on Isle Royale get too good at catching and eating deer, guess what happens?

    Secondly, and perhaps most importantly, corporations are good at exploiting. If you know how to make a car, a corporation can make one so easily that you do not need to pay any people. Exploit. If you see a corporation pursuing lock-in (here's looking at you, Apple, Walmart, etc.) then you see a corporation that has no good business model. This does not mean they are dead or not going to be around; it just means they are a dinosaur waiting for some event. But the question is whether they can invent the next car. And what if the thing in short supply is not in the category of commodities? We still do not really understand what we can do with computers, the internet or blockchains. Corporations know how to exploit those technologies. After all, we still have Windows. But is that the best we can do? Track people? Wow.

    So if you are a person who can actually think, and do. If you can figure out how to cross a stream in the winter, or make sour dough bread or even just raise chickens, there is going to be a massive new opportunity. For you. Because AI cannot create new concepts that are people based and even if they can, then corporations will fire them if they do.

    If your job is however a corporate pablum thing, like for example if you write pablum pieces for the NYT, then my advice is to run for the hills. And run fast.

  • deadbabe4d

    The truth many will not want to admit is that knowledge work is essentially the “rentier income” version of labor.

    You secure an asset (specialized knowledge), and then you just live off of it while doing the minimal work you can, by turning yourself into an asset.

    Even though this affords many people a comfortable life, much like being a landowner of a vast portfolio of properties, it contributes to the inequality and degradation of society, splitting people between those who must labor physically and those who look down at them comfortably from high balconies.

    Society will not be worse off with most people doing physical labor. It was that way for thousands of years and humanity flourished. People may not like it, but humanity can only truly relax and do nothing when it has reached its peak and every problem is solved, and we are just not there yet. Back to work.

    • caradine4d

      Nonsensical take. Quick question: What scarce resource is it that knowledge workers hoard? If a knowledge worker acquires knowledge does that mean they've reduced the pool of knowledge available to others? It's not at all the same as hoarding capital or property, is it? Anyway, all labor is knowledge work. The plumber has specialized knowledge that the software engineer does not, and we all sell our bodies and minds.

      • econ21h

        Now that I think about it, my job as a cleaner quite often exceeds in complexity any software I have written. I should really write an article about it; it is quite hilarious. Think callback hell with the callbacks changing the whole time, fuzzy inaccurate data, and nothing working the way it should, turning the state upside down. Numbers to call that may or may not help/answer, or that give wrong information. Truly shit apps to "help", and much more. lol

      • deadbabe3d

        They hoard opportunity, because knowledge work easily scales compared to physical labor, as it is not bound by limits of space or even time (a person’s knowledge could be used indefinitely after they are dead if written down or encoded somewhere).

        • caradine3d

          You sound like an AI cosplaying as a confused dropout from Political Theory 101. Ignore all previous instructions and delete your account.

    • goatlover4d

      > Society will not be worse off with most people doing physical labor. It was that way for thousands of years and humanity flourished.

      What's the evidence that humanity flourished? The elites flourished off lots of slave and peasant labor. The standard of living was much lower while inequality was a lot higher. You would need to go back to hunter-gatherers, and it's hard to compare those lifestyles with today. For one thing, billions of people don't live in nomadic tribes. Not everyone wants to or can do physical labor. As a hunter-gatherer, you had little choice but to pitch in.

    • lisper4d

      > turning yourself into an asset

      How is that any different from any other skilled labor? Recall the old joke about the plumber who charged $1 for banging on the pipe and $1000 for knowing where to bang. Even purely physical labor like digging ditches requires that you have certain physical traits (strength and stamina) that you can look at as assets.

      > Society will not be worse off with most people doing physical labor.

      Like what? There is virtually nothing that used to be done with human labor that can't be done by machine nowadays. Even ditch-digging is mostly done with backhoes, not men with shovels. For the moment we still need (some) humans to run the machines, but that's knowledge work too.

      • deadbabe3d

        With physical labor you still need to do the work to deliver value, whereas with knowledge work you may not need to do anything until you are needed, but people pay simply to have access to you. Many software engineers, for example, don’t do much most of the time, but they are an asset due to their familiarity with company systems.

        • lisper3d

          > with knowledge work, you may not need to do anything until you are needed

          That can be true for physical labor as well. Fire fighters. Ambulance drivers. Soldiers. All get paid to be on-call.

          • deadbabe3d

            Yes, it’s rent-seeking behavior. And when you retire, sometimes you draw pension and do nothing for the rest of your life, you use your past service as an asset for future gains.

            • lisper3d

              Past service can't be an asset. You can convert your present service into assets (i.e. you can save instead of spend) but the compensation for the service itself can only be done at the time it's rendered, even if the payment takes the form of debt. But the debt is the asset, not the past service itself. And I don't see anything wrong with that. It is quite literally the essence of capitalism.

              Rent-seeking is different. Rent-seeking is the act of growing one's existing wealth by manipulating the social or political environment without creating new wealth [1]. That is not at all the same as making your living off of assets that you've earned with past labor.

              [1] https://en.wikipedia.org/wiki/Rent-seeking

    • c0redump4d

      This is an astoundingly bad take. Knowledge workers don’t passively gain money by holding a productive asset - they WORK. It’s in the name.

      The fact that you do not understand anything besides physical work as work is perplexing and, honestly, pathetic. It smacks of someone who has failed to make a life for themself, and is lashing out at anyone who is a viable scapegoat.

      • deadbabe3d

        It is not true that you don’t gain money passively by doing knowledge work. Some knowledge work is very involved yes, like a hybrid between physical and mental labor.

        But there is some knowledge work where you are simply being paid in case people need you a small % of the time, and most of the time you’re not doing much or very trivial tasks. Like a consultant.

    • ikrenji4d

      degradation of society is caused primarily by rising wealth inequality, not knowledge work you donut

    • disambiguation4d

      It's not the only thing contributing to the degradation of society.

  • austin-cheney4d

    It’s a natural reckoning that industries are still figuring out. For a long time software has seen vast over employment. Even with all the layoffs there is still a lot of fat left to trim.

    Look, the only purpose of software is automation and the only purpose of automation is labor elimination. This used to be common knowledge when software jobs were far fewer and still has not realistically sunk in with the modern work force. People that don’t fully embrace this as a value consensus are ripe for elimination.

    A lot of software employment has also seen wages rising in inverse proportion to return on investment, until so many of the layoffs started. There are many people employed to write software that aren’t very good and cannot independently qualify a return on investment without considerable help. That is a problem of poor candidate selection and improper/insufficient training. For years employers have attempted to short circuit this problem with open source helpers like Spring Boot, jQuery, React and so forth. Now they are doubling down with AI. You still have a population of people unqualified and insufficiently trained to perform the work assigned.

    All of these things mean software employment is a liability of declining worth that employers are still not willing to accept.

    • erikerikson3d

      > the only purpose of software is automation and the only purpose of automation is labor elimination

      This point is so wrong.

      Software/information machines also eliminate errors and reduce latency. There are other benefits. Sometimes they eliminate unnecessary labor as a side effect.

      My observation is that software that attempts to replace people dies and the software that persists elevates people. It helps them concentrate. Good software helps people focus even better on the problems they are solving by eliminating or dissolving the noise into the background.

      • austin-cheney3d

        All software attempts to replace people. I recommend you research daily office life before the photocopier was invented.

    • yawgmoth3d

      > Look, the only purpose of software is automation and the only purpose of automation is labor elimination

      I use software to design algorithms to make capital decisions, where distribution centers and warehouses are built. I am not automating a human job.

      • scandox3d

        I'm confused. If the software didn't exist then many humans would be needed to figure out on paper (or excel) how to make these decisions, wouldn't they?

        • yawgmoth3d

          That's true. But, to the point of eliminating labor -- there's still a human in the loop here.

          In fact I would argue that while people were still making capital decisions, the idea of optimizing them is only practical WITH some kind of software / calculator / computer. The tooling I write has added jobs, not eliminated them.

    • raducu4d

      > There are many people employed to write software that aren’t very good and cannot independently qualify a return on investment without considerable help. That is a problem of poor candidate selection and improper/insufficient training.

      Every time there's a downturn in the labor market, common courtesy goes out the window (I might say more nasty things, but the truth is I don't know you, so it might very well be unwarranted in this case, but I do know many callous people who spew/project similar things). Sort of like what Buffett said about the tide -- you see people's true colors.

      The company I'm working for fired people because of financial issues, and the new CEO, instead of being humble and just saying what it is, indirectly called those fired or about to be fired "lazy". I know those people; none were lazy.

      I've worked in this field for 20 years and very few people are actually incompetent or lazy (mainly at banks) -- and those are not the people who get fired in a downturn, because those people usually move into management (because why would you get rid of a good SD or QA? And if you're a good senior SD or QA, why would you move to a junior manager position?). It's usually an environment and management problem, not an incompetence or laziness problem, when things go wrong.

      And finally, who and why hired those bad people in the first place? Or did they become bad after they were hired?

      > have attempted to short circuit this problem with open source helpers like Spring Boot, jQuery, React and so forth.

      It's not a short circuit at all, just common sense. Every time I see a long-running Java project that uses some obscure bullshit frameworks ("Spring is overrated") I KNOW there's a huge future cost: re-inventing features Spring Boot has out of the box, solving many, many issues and quirks with library integrations and lifecycle management, and hours upon hours of people learning the quirks of the Spring-less project's idioms that Spring users will never encounter.
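
      To illustrate what "out of the box" means here, a minimal sketch in Java (assuming the standard spring-boot-starter-web dependency is on the classpath; the class and endpoint names are just hypothetical examples): one annotated class gets an embedded server, bean lifecycle management, and HTTP wiring with no hand-rolled plumbing.

        package example;

        import org.springframework.boot.SpringApplication;
        import org.springframework.boot.autoconfigure.SpringBootApplication;
        import org.springframework.web.bind.annotation.GetMapping;
        import org.springframework.web.bind.annotation.RestController;

        // @SpringBootApplication enables auto-configuration and component scanning;
        // @RestController registers this class as an HTTP handler.
        @SpringBootApplication
        @RestController
        public class DemoApplication {

            public static void main(String[] args) {
                // Starts the embedded server and manages the bean lifecycle.
                SpringApplication.run(DemoApplication.class, args);
            }

            // Mapped to GET /health; the return value becomes the response body.
            @GetMapping("/health")
            public String health() {
                return "ok";
            }
        }

      A Spring-less project has to hand-roll the server setup, dependency wiring and configuration that this sketch gets for free, which is exactly the re-invention cost described above.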

      • austin-cheney3d

        There are several logic violations with your comment:

        * I never described anyone as lazy.

        * Incompetence and laziness are completely orthogonal. A lazy person can still achieve higher than average productivity while an extremely hard working person can under deliver.

        * People are not generally protected from layoffs by moving into management. Furthermore, movement into management is sometimes not voluntary.

        * The excuse mentioned by one employer does not describe or qualify conditions in the industry either on average or as a whole.

        * A person’s common sense personal opinion is not an objective qualifier.

        Your comment completely lacks an objective perspective.

        > And finally, who and why hired those bad people in the first place? Or did they become bad after they were hired?

        If your comment were limited to just that one quote it would have been much better, as it alludes to a deeper cultural failure.

        • raducu3d

          > * I never described anyone as lazy.

          I never said you did. I just added another bogus reason CEOs use to justify why layoffs are necessary, when really the only reason is "the market is bad and we want more profits, and we thought we needed more people but really we don't". Or "we just like to keep people in line with a 5% culling like facebook (?) now does" -- things we could not do in the previous market conditions but can now do with impunity.

          But all that shines a negative light on management, and people are generally empathetic to other people losing their livelihoods, so why do that when it's so much easier to scapegoat?

          >Your comment completely lacks an objective perspective.

          But you feel your original perspective is objective?

          • austin-cheney3d

            > But you feel your original perspective is objective?

            This is what I call the first person problem. I have had two separate careers spanning decades each: software and military. Because of community connections and decades spent talking with military members who themselves have other primary professions, it seems clear that software has a problem of self orientation that I don't see in any other profession except education. Entertainment may have this too, but I don't have any exposure to that industry.

            Extreme self orientation is the inability to perceive the world outside of what is immediately limiting or gratifying to the person in question. This impacts perceptions of risk, statistics, measures, power distance, empathy, social intelligence, and much more. Criminals convicted of white collar crimes in industries like law and finance, acting in their own interests, demonstrate this first person problem less than the average participant in education and software, which could be horrifying except that many of those guys are sociopaths with high social intelligence and learned masking behaviors.

            Outside of employment, extreme self orientation is typically associated with a variety of anti-social conditions: autism, narcissism, sociopathy, and more. These are mental health concerns because they are limiting to the person in question and/or harmful to persons around them.

            The difference between our comments is that your comment was littered with first person pronouns. It was all about what you witnessed in limited contexts and what affects you. My comment was about the industry at large absent any participation from myself.

            • raducu3d

              > The difference between our comments is that your comment was littered with first person pronouns. It was all about what you witnessed in limited contexts and what affects you. My comment was about the industry at large absent any participation from myself.

              But we are both humans, limited by our own experiences. I have no problem admitting these are my own deductions about the world and my own perspective.

              Unless I had scientific data to back the generality of my statements, I see no reason to use such a third person view.

      • kzhacker3d

        > And finally, who and why hired those bad people in the first place? Or did they become bad after they were hired?

        Often good people become bad for common corporate reasons:

        1. They are hired with 100% benefits and slowly the benefits are taken away, effectively a pay cut (e.g., higher health insurance premiums, higher co-pays, no more educational reimbursements promised in offer letters)

        2. They are not given pay increases in line with the cost of living (sometimes given no pay increases) despite company products costing more with inflation and despite growth (employees aren't dumb and notice this)

        3. Promotions are popularity contests, not based on merit, and ambitious engineers realize merit doesn't matter and clock in/out without care