Hacker News | longfacehorrace's comments

Looked at repos of the two loudest users in that thread; either they have none or it's all forks of other projects.

Non-contributors dictating how the hen makes bread.


In general, caving to online mobs is a bad long-term strategy (assuming the mob is not the majority of your actual target audience[0]). The mob does not care about your project, product, or service, and it will not reward you for your compliance. Instead it only sees your compliance as a weakness to further target.

[0] While this fact can be difficult to ascertain, one must remember that mobs are generally much, much louder than normal users, and normal users are generally quiet even when the mob is loud.


Yes, but also... that's like 90% of the interactions you get on the internet?

I don't want to be too meta, but isn't that a description of most HN threads? We show up to criticize other people's work for self-gratification. In this case, we're here to criticize the dev caving in, even though most of us don't even know what Stoat is and we don't care.

Except for some corner cases, most developers and content creators mostly get negative engagement, because it's less of an adrenaline rush to say "I like your work" than to say "you're wrong and I'm smarter than you". Many learn to live with it, and when they make a decision like that, it's probably because they actually agree with the mob.


I don't actually care what the dev does. That's their prerogative, and it doesn't affect whether or not I'll use the software (I will if it's useful). I think that's the difference between here and a "mob", assuming other commenters think similarly.

I do think it's harmful to cave in, but that doesn't make me think less of the maintainer's character. On the other hand, some of the commenters in the issue might decry them as evil if they made the "wrong" decision.

It's fine to have opinions on the actions of others, but it's not fine to burn them at the stake.


Not just online; priests, CEOs, celebrities, politicians: if you don't make them happy, you're a sinner, a bad employee, a hater of freedom, etc.

Anyone with a rhetorical opinion who otherwise contributes little to getting cars off assembly lines, homes built, or network cables laid.

In physical terms the world is full of socialist grifters, in that they have only a voice, no skill. They are reliant on money because they're helpless on their own.

Engineers could rule the world if they acted collectively rather than starting personal businesses. If we sat on our hands until demands were met, the world would stop.

A lot of people in charge fear tech unions, because we control how the world gets shit done.


Workers should rule the world, but absolutely not engineers specifically. God, I shudder to think of such a world.


Isn't forking other projects how you usually contribute code on GitHub?


Makes a good block list: people vehemently arguing to block AI.


most of the anti-AI community have already migrated their repos from Slophub


Doubt a lawyer actually modified a website.

That's what GPT is for.

Trivial syntax glitches matter in math and code.

In law what matters is the meaning of the overall composition, "the big picture", not trivial details a linguist would care about.

Stick to contextualizing the technology side of things. This "zomg no apostrophe" just comes off as cringe.


It's hard to believe that a LLM would make a mistake like this. It's literally called a Large Language Model.


Regular people don't have global reach and influence over humanity's agency, attention, beliefs, politics and economics.


If Donald Trump did this, he wouldn't be criminally liable either.


Nuremberg/just following orders might fly if we were talking about a cashier at Dollar General.

This is a genius tech bro who ignored warnings coming out of institutions and general public frustration. It would be difficult to believe they didn't have some idea of the risks, of how their reach into others' lives manipulated agency.

Ground truth is apples:oranges but parallels to looting riches then fleeing Germany are hard to unsee.


It's claimed Adam Smith wrote hundreds of years ago that (paraphrased) division of labor taken to extremes would result in humans dumber than the lowest animal.

This era proves it out, I believe.

The decline in manual, cross-context skills and the rise in "knowledge" jobs is a huge part of our problem. The labor pool lacks muscle memory across contexts and cannot readily pivot in defiance.

Socialized knowledge has a habit of being discredited and obsoleted with generational churn, while physical reality hangs in there. Not looking great for those who planned on 30-40 years of cloud engineering and becoming director of such n such before attaining title of vp of this and that.


Front row seats to the apocalypse would be metal af.


move to SF. that's the place AI will nuke first


Car manufacturers need to step up their hype game...

New Honda Civic discovered Pacific Ocean!

New F150 discovers Utah Salt Flats!

Sure it took humans engineering and operating our machines, but the car is the real contributor here!


It's not just the keyboard. My iPhone 15 is often so unresponsive that I end up tapping twice as much.

One example, though the issue isn't limited to web browsing: Safari will do nothing, I tap again, it does the thing, then it does the thing again because of the second tap. I have to tap back to get to where I really wanted to go.


Sounds like the liquid glass animations are so heavy that if the system is busy with anything else for a second then everything simply breaks.

I remember seeing the videos of CPU usage spiking over 40% just to show the Control Center.

And similarly, even on a Mac I find myself clicking on links and buttons multiple times just for things to work. It has a dedicated keyboard; how is it that they messed things up so much that a physical keyboard stops working? It's an interrupt-based interface that takes less than a millisecond to process a keystroke; how can someone mess things up so freaking stupidly?


That was basically the whole point of it.

Apple makes money selling hardware; they have a vested interest in making things slower/worse to incentivize people to buy newer hardware.

This is why you can never really trust Apple and also why no matter how bad Windows gets, it's still a better deal because at least you can count on the fact that PC businesses will compete on the hardware front to get your money.

Choosing Apple is a lot like being in an abusive relationship; you can't leave because the switching costs are quite high, so you tolerate a lot more abuse than you would be willing to otherwise.

And this is the reason people try to not rely on Apple software too much; if you do, they truly have you by the balls.


Shortcuts run but often do not trigger all the stages in a pipeline. There were no issues with the same Shortcuts prior to installing iOS 26. These Shortcuts do not trigger UI transitions; they send data over the network.

Sounds like Apple management allowed a quality assurance failure that is creating so many distractions for users that it's turning people against Apple.

Tim Cook handing his replacement a dumpster fire.


Extremely common pitfall in UI engineering. If you treat all input as a queue that's divorced from output, you end up with situations like this.

It's kind of a paradox, but in many cases you need to actually discard touch inputs until your UI state has transitioned as a result of previous inputs. This gets extremely nuanced and it's hard to write straightforward rules about when you should and shouldn't do this. Some situations I can think of:

- Navigation: User taps a button that pushes a screen on your nav stack. You need to discard or prevent inputs while the transition animation is happening, otherwise you can push multiple copies of that screen.

- Async tasks: User taps a button that kicks off an HTTP request or similar, and you need to wait on the result before doing something else like navigation or entering some other state. Absolutely you will need to prevent inputs that would submit that request twice. You will also need some idempotency in your API design to handle failure/retries. A fun example from the 1990s is the "are you sure you want to make this POST request again" dialog that Web browsers still show by default.

- Typing: You should never discard keystrokes that insert/delete characters while a text input field is focused, but you may have to handle a state like the above if "Enter" (or whatever "done" button is displayed in the case of a software keyboard) does something like submit a form or do navigation.

Essentially we're all still riding on stuff that the original Mac OS codified in the 1980s (and some of it was stolen from Xerox, yes), so the actual interaction model of UIs is a mess of modal state that we hardly ever actually want to fully realize in code. UI is a hard problem!
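The navigation case above can be sketched as a tiny state machine (all names here are hypothetical, not from UIKit or any real framework): taps that arrive while a transition animation is running are discarded outright rather than queued, so a double-tap can't push the same screen twice.

```typescript
// Sketch: gate input during a nav transition (hypothetical API).
type NavState = "idle" | "transitioning";

class NavStack {
  private stack: string[] = [];
  private state: NavState = "idle";

  // Returns true if the push was accepted, false if the tap was discarded.
  push(screen: string): boolean {
    if (this.state === "transitioning") return false; // discard, don't queue
    this.state = "transitioning";
    this.stack.push(screen);
    return true;
  }

  // Called when the push animation finishes.
  transitionDidEnd(): void {
    this.state = "idle";
  }

  get screens(): readonly string[] {
    return this.stack;
  }
}

const nav = new NavStack();
nav.push("Settings");            // accepted
nav.push("Settings");            // second tap during animation: discarded
nav.transitionDidEnd();
console.log(nav.screens.length); // 1
```

The key design choice is that the discarded tap is gone for good; replaying it after the animation would reproduce exactly the double-action bug described above.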


This analysis ignores the fact that the user experience has regressed from a previous version which didn’t have these issues.

So it’s not like some longstanding industry-wide UI issues they’ve ignored forever, it’s that Apple has introduced new tradeoffs or lowered their quality standards to the point that some users feel their experience has worsened.



Okay, how long is the debounce window? Where in the input pipeline do you debounce (obviously not immediately on keystrokes)? Will debounce work for long-running requests, which are event-driven and not time-driven?

I have seen, far too many times, naive approaches like wrapping all click handlers in a "debounce" function cause additional issues and not actually solve the underlying problem.
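The distinction can be sketched like this (all names illustrative): a fixed-window debounce guesses a duration, while event-driven gating is tied to the request itself settling, however long that takes.

```typescript
// Naive: a fixed time window. If the request outlives the window, a second
// click slips through and the request is submitted twice.
function debounce<A extends unknown[]>(fn: (...args: A) => void, ms: number) {
  let last = 0;
  return (...args: A) => {
    const now = Date.now();
    if (now - last < ms) return;
    last = now;
    fn(...args);
  };
}

// Event-driven: disabled until the request settles, no timer to guess.
function gate<A extends unknown[]>(fn: (...args: A) => Promise<void>) {
  let inFlight = false;
  return async (...args: A) => {
    if (inFlight) return;   // discard, don't queue
    inFlight = true;
    try {
      await fn(...args);
    } finally {
      inFlight = false;     // re-enable only when the result arrives
    }
  };
}
```

Even the gated version only solves the client side; as noted above, the API still needs idempotency to survive retries after a failure.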


To clarify - I am not stating that simple debouncing is the solution to all the issues you're identifying. I agree with you that handling some of them can be very complex. I just shared the article as a pointer to a broadly similar concept that can be used to help communicate the gist of what you're talking about.


Just to correct a common error: nothing was stolen from Xerox. Apple gave Xerox stock (which Xerox later sold too early) in exchange for demos and access to the PARC work on Smalltalk and GUIs.


Two things:

Decades of reiterating to myself it's none of the verbose semantics and socialized sensory memory my brain tries to feed me (imposter syndrome for example), but is just biochemistry habit from exposure to social memes.

4-5 mile walk as often as possible. Any less and a sense of reset does not kick in.




Tested shorter. Takes 2-2.5 miles to trigger a muscle relaxation response in core and shoulders. Another 2+ miles in that state to feel the same release in extremities and head/mental space.


A generation deified via TV appearances and book sales for decades.

They look online and see humanity moving on.

They look in the mirror and see less time ahead of them than behind them.

