zbyte64

joined 2 years ago
[–] zbyte64@awful.systems 0 points 1 day ago (1 children)

If it gets it wrong the first time I rarely reprompt. I know I can get it to fix it, but it's usually faster for me to do it myself because I've already figured out where the fix goes and what it is. Low-key think it's just a ploy to get us to burn more tokens. Sure, correcting it means it writes a few lines to the memory file, but it's only a matter of time before it trips over that context as well.

[–] zbyte64@awful.systems 2 points 1 day ago (3 children)

I have similar problems whenever I send it to investigate a bug and the local runtime is inside a container. It cannot reliably translate paths without the help of an IDE. Hell, it even occasionally mangles API paths if they're prefixed elsewhere in the codebase (despite having Claude.md etc.; your context needs to be pure for it to be reliable). Having it fix a Dockerfile is comically bad.

[–] zbyte64@awful.systems 0 points 2 days ago (1 children)

Any luck with integrating PlatformIO? I have an ESP32 project, but VSCode can't provide type hinting with its main C++ extension, which is what PlatformIO uses.

[–] zbyte64@awful.systems 9 points 2 days ago* (last edited 2 days ago) (3 children)

In my experience there are three ways to be successful with this tool:

  • write something that already exists so it doesn't need to think
  • do all the thinking for it upfront (hello waterfall development)
  • work in very small iterations that don't require any leaps of logic. Don't reprompt when it gets something wrong; instead, reshape the code so it can only get it right

The issue with debugging is that it doesn't actually think. LLMs pattern match to a chain of thought based on signals, not reasoning. For it to debug, you need good signals in your code that explicitly state what the code is doing, and LLMs do not write code with that level of observability by default.
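To make the observability point concrete, here's a minimal sketch (function and log messages are made up for illustration, not from any real project) of what "good signals" look like: every branch logs what it decided and why, so a model reading the output doesn't have to guess the control flow.

```python
import logging
from typing import Optional

logging.basicConfig(level=logging.DEBUG, format="%(levelname)s %(message)s")
log = logging.getLogger("pricing")

def apply_discount(price: float, code: Optional[str]) -> float:
    """Apply a discount code; log every branch so failures are traceable."""
    log.debug("apply_discount called: price=%r code=%r", price, code)
    if code is None:
        log.debug("no code supplied, returning price unchanged")
        return price
    if code == "HALF":
        result = price / 2
        log.debug("code HALF matched: %r -> %r", price, result)
        return result
    # An unknown code is an explicit signal, not a silent no-op
    log.warning("unknown discount code %r, ignoring", code)
    return price
```

With logs like these in the failing run's output, "why did the discount not apply" becomes a pattern the model can actually match on instead of a leap of logic.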

Edit: one of my workflows that I had success with is as follows:

  • write a gherkin feature file describing the desired functionality; maybe have the LLM create multiple scenarios after I define one for it to copy
  • tell the LLM to write tests using those feature files; it does an okay job but needs help making the tests run in parallel
  • if the feature is simple, ask the LLM to make a plan and review it
  • if the feature is complex, stub out the implementation in code and add TODOs, then direct the LLM to plan. Giving explicit goals in the code itself reduces token consumption and yields better plans

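For the first step, a feature file in that workflow might look like this (the feature and names are hypothetical, just to show the shape the LLM copies):

```gherkin
# Hypothetical example -- feature and names invented for illustration
Feature: Password reset
  Scenario: Registered user requests a reset link   # written by hand first
    Given a registered user "alice@example.com"
    When she requests a password reset
    Then a reset email is sent to "alice@example.com"

  Scenario: Unknown address gets no email           # LLM copies the shape
    Given no user is registered as "bob@example.com"
    When a password reset is requested for "bob@example.com"
    Then no email is sent
```

Hand-writing the first scenario pins down the vocabulary and level of detail, so the generated scenarios stay consistent instead of drifting.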
[–] zbyte64@awful.systems 1 points 2 days ago* (last edited 2 days ago)

More offended at being called a lib, really. And I seriously don't think that joke was homophobic, but comedy is in the eye of the beholder, so it is whatever you say it is.

[–] zbyte64@awful.systems -3 points 2 days ago (2 children)

He's not exactly straight....

[–] zbyte64@awful.systems 1 points 2 days ago

Maybe that's the next strategy for Dems to try to win over Republicans

[–] zbyte64@awful.systems 2 points 2 days ago* (last edited 2 days ago)

Fewer cars does not mean fewer cars on the road. Autonomous cars mean more cars driving on the road at any given time, since now you have empty cars shuttling from person to person. It's less safe from a total VMT (vehicle miles traveled) perspective. And they sure as hell will use this as an excuse not to give us mass transit.

[–] zbyte64@awful.systems 1 points 3 days ago

Might be both. Tell Israel it's to help, but keep records to make the case for exiting the war.

[–] zbyte64@awful.systems 1 points 3 days ago* (last edited 3 days ago)

Maybe they're "Trauma Bo~~nd~~mbing"

[–] zbyte64@awful.systems 1 points 3 days ago

I want to know if sandhexen means sandwich or "hexed sandwich", but I refuse to look it up because "cursed sandwiches" is headcanon now

[–] zbyte64@awful.systems 5 points 3 days ago (2 children)

Not necessarily. The same number of people are doing the same amount of travel by car, but now, instead of sitting in a parking lot, the car drives off to do another errand. More efficient in terms of fewer cars needed, but overall it's more vehicle miles and more congestion.

 

A critical and funny critique of an AI-written song.
