Hacker News | jpalomaki's comments

Sometimes an easy performance trick is to split the CTE into separate queries, put the results into unlogged temporary tables, and add whatever indexes the next step needs.

Obviously this only makes sense for things like analytical queries that aren't running constantly.
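A minimal sketch of the idea in Postgres-flavoured SQL (all table and column names here are made up for illustration):

```sql
-- Instead of one big query like:
--   WITH recent_orders AS (...) SELECT ... FROM recent_orders JOIN ...

-- Materialize the CTE's result into an unlogged table (skips WAL writes):
CREATE UNLOGGED TABLE recent_orders AS
SELECT customer_id, order_date, total
FROM orders
WHERE order_date >= now() - interval '30 days';

-- Add exactly the index the next step needs:
CREATE INDEX ON recent_orders (customer_id);

-- Give the planner fresh statistics on the new table:
ANALYZE recent_orders;

-- The follow-up query can now use that index:
SELECT c.name, sum(r.total)
FROM recent_orders r
JOIN customers c ON c.id = r.customer_id
GROUP BY c.name;
```

The win is that the planner gets real row counts and a usable index at each step, instead of having to guess across the whole CTE pipeline.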


Worth underlining the OLAP-versus-OLTP divide you're speaking to at the close there.

An issue that has arisen for me in some situations: for more expensive reporting queries we point at a DB replica, where temporary tables are not an option.

Somebody took a deeper look at Claude Code and claims to find evidence of Anthropic's PaaS offering [1]. There's certainly money to be made by offering a nice platform where "citizen developers" can push their code.

From Astral, the (fast) linter and type checker are pretty useful companions for agentic development.

[1] https://x.com/AprilNEA/status/2034209430158619084


I wouldn't be surprised if Vercel were bought by Anthropic/OAI (but maybe it would be too expensive?)


No no - SpaceX/xAI must now buy Vercel so that we can deploy our bloated Next apps to space.


Next now renamed to Xext.


At least in space there is lots of space and no heat /s - I'd love for Next to exist in a vacuum


Nothing is too expensive. It will be a bidding war.


Home scale example with strawberries: https://www.youtube.com/watch?v=8LIhx0yoM7s


A good solution for memory would help with stickiness. But it's a hard thing to crack.


Both Anthropic and OpenAI are working hard to move away from being "just" the LLM provider in the background.


Facebook is lacking access to the interesting data. If you are in the Google ecosystem then your private and business life is likely already there.


Kagi is using Google search behind the scenes. I think that’s why it felt so easy to switch to.


That tracks with the 'we use everybody and curate optimal results' model they've got going on, but I wouldn't be changing the search habits of decades if I didn't mean to actively reject what Google search has turned into. So it's not a good way to justify paying them.


Would be quite handy for gesture control. When wearing thick gloves you need to take them off to operate the current AirPods.


This was a solved problem in the 1st and 2nd generation of AirPods with tap controls[1]. I'm still surprised that they removed that feature in favor of pressure, although now that I'm reflecting more on it, I wonder if it's part of Apple using their manufacturing and engineering as a moat[2]. i.e. Tap controls are relatively easy, so once wireless earbuds became commodities, they had to figure out some way to differentiate themselves.

That said, as someone who does pottery (messy hands), wears gloves/hats (stuff in the way), and has relatively poor fine motor control, I guess I welcome any solution that doesn't mean getting clay or cold air in my hair/ear.

The battery consumption and latency of the IR cameras will be interesting though. Too sensitive, and you'll eat up your battery. Not sensitive enough, and UX suffers.

1: https://support.apple.com/en-us/102628 2: https://news.ycombinator.com/item?id=45186975


The Pros have Find My support, and you can even ping the earbuds separately via the app.

Also, the new model can recognize when you fall asleep and stop the media. I think it works, but I'm not sure how quickly it detects sleep.


Mine are older and support Find My, but only when they’re out of the case. If I can’t find my case when they’re in it, I’m stuck. Does pro do anything for that?


Yes it does! Each ear separately and the case itself.


Now that we have seen this can be done, the next question is how much effort it takes to improve it by 1%. And then the next 1%. Can we make consistent improvements without spending more and more compute on each step?

