I Started Programming When I Was 7. I'm 50 Now and the Thing I Loved Has Changed
(www.jamesdrandall.com)
I've had this problem with abstractions for the longest time. Of course, whenever I say anything negative about abstractions I just get dog-piled, so I don't usually like to discuss the topic.
I think abstractions as a tool are fine. My problem is that most developers I meet only ever talk about the upsides of abstractions and never seriously weigh the downsides.
More often than not, people treat abstraction as a magical tool you can't overuse. In reality, overusing abstractions can increase complexity and reduce readability. It can also greatly reduce the number of assumptions you can make about code, which carries many additional downsides.
Of course I'm not saying we shouldn't use abstractions. Having no abstractions at all can be just as bad as having too many; you end up with similar issues, like increased complexity and reduced readability.
The hard part is finding the balance: the sweet spot where complexity is minimized and readability is maximized while using the fewest abstractions possible.
I think too often developers err on the side of caution, add more abstractions than necessary, and call it good enough. We really need to question whether every abstraction is worth its cost. Is it really worth adding another layer just because a problem might arise in the future, versus starting with fewer abstractions and only adding more once the problem actually shows up? I don't think we do the latter enough. Often you can get away with slightly fewer abstractions than you think you need, because you'll never touch the code again.
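To make that concrete, here's a deliberately exaggerated sketch (the greeter example and every name in it are invented for illustration): the same behavior written through three speculative layers, and then written directly.

```python
# Hypothetical over-abstracted version: a strategy interface and a
# factory wrapping what is ultimately a single string operation.
from abc import ABC, abstractmethod

class GreeterStrategy(ABC):
    @abstractmethod
    def greet(self, name: str) -> str: ...

class EnglishGreeter(GreeterStrategy):
    def greet(self, name: str) -> str:
        return f"Hello, {name}!"

class GreeterFactory:
    def create(self) -> GreeterStrategy:
        return EnglishGreeter()

def greet_user(name: str) -> str:
    return GreeterFactory().create().greet(name)

# The direct version does the same thing in one line. If a second
# greeting strategy ever actually shows up, it can be refactored into
# the layered form at that point.
def greet_user_direct(name: str) -> str:
    return f"Hello, {name}!"

print(greet_user("Ada"))         # Hello, Ada!
print(greet_user_direct("Ada"))  # Hello, Ada!
```

Neither version is wrong in the abstract; the question is whether the extra layers are paying for the assumptions they take away from the reader.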
I think the design of interfaces is a difficult and subtle art. You have to have a very refined understanding of both of the layers being interfaced, so you can understand where the boundary should go and what shape it should have so concepts don't get split across layers. You also need to have a keen understanding of how the humans using the interface will behave in the future, which is really hard and often impossible. I think that's why interfaces tend to evolve over time along with the tech, because assumptions built into them were either incorrect, or became incorrect (or just confusing) as the technical landscape shifts around them.
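As a rough sketch of what a boundary in the wrong place can look like (the store classes and data here are made up for illustration): an interface that makes callers pass predicates over raw rows has quietly made the storage schema part of the interface, while moving the query concept behind the boundary keeps it whole.

```python
# Hypothetical example data standing in for a storage layer.
USERS = [
    {"name": "Ada", "age": 36, "active": True},
    {"name": "Bob", "age": 17, "active": True},
]

class LeakyUserStore:
    # Callers must pass a predicate over the raw row dicts, so every
    # caller now depends on how users happen to be stored.
    def find(self, predicate):
        return [u for u in USERS if predicate(u)]

class UserStore:
    # The query concept lives behind the boundary; callers never see
    # the row representation.
    def find_active_adults(self):
        return [u for u in USERS if u["active"] and u["age"] >= 18]

print(LeakyUserStore().find(lambda u: u["active"] and u["age"] >= 18))
print(UserStore().find_active_adults())
```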
Speaking of shifting landscapes, I think one of the fundamental practices of engineering is prototyping: building a thing that does what you think you want, even if it's janky, unscalable, or has insane cyclomatic complexity. Sometimes building the janky version leads to insights into how an improved version can be made; insights that would be very difficult or impossible to predict if you tried to build the perfect version on the first go.
This causes some problems in corporate development, because the chance to learn from a model and iterate on it directly is so rare. The vast majority of the time (IME), as soon as the janky version fulfills the client's requirements, it moves into production and the improvements are, if not written off entirely, put on the backlog of tasks that will never be more important than building out the next new thing. It's still possible to iterate on ideas across future projects, and it happens all the time, but it's different from building one thing and refining it in an iterative development cycle over the long term.