Hacker News
Habits I've developed for fast and efficient programming (cprimozic.net)
108 points by Ameo on Oct 24, 2021 | 102 comments



I would add 2 things: 1) use libraries that are stable and well documented, 2) use things that work well out-of-the-box.

For 2 I'd like to give you the example of Grafana. To get it up and running, all you have to do is install an RPM and start the service. That's it; you can immediately start playing around with the software. The out-of-the-box configuration is fine for 90% of use cases. Upgrades are equally smooth: everything is done for you. Just install the newer version and restart the service, and the database is upgraded automatically. By default an SQLite database is used to store everything, but if for some reason that becomes a bottleneck you can switch to MySQL, at which point some more advanced skills are needed. But that's kind of the point I'm trying to make: it works well OOTB but can be tailored beyond its default configuration.

I've grown to loathe software that needs a lot of manual setup: creating databases, importing the schema and bootstrap data, going over a myriad of configuration options because the defaults don't work OOTB, having to configure things that could easily be auto-detected by code, requiring dependencies that are not available on your distro, moving files around and setting permissions yourself... It all feels needlessly complex. Isn't the computer supposed to work _for me_?

Over time I've started to feel that the OOTB experience is directly proportional to the quality of the software. Using tools that just straight up work, and dropping those that require intensive tinkering to get going, has definitely improved my work experience.


Yeah, I am also a big fan of using things as close to the default as possible. Highly customizable tools are great, but often I feel it's a better time investment to learn how to use the standard configuration well. Then any time you sit down in an unfamiliar dev environment, you will generally be able to get up and running quickly without having to remember all the things you had to do to set up your tools.


Pretty much 95% of this list is why Kotlin is my primary programming language now. It's more productive because it accomplishes almost everything on this list out of the box in IntelliJ IDEA. The only bits I generally need to add are some Gradle plugins to handle linting (spotless or kotlinter), no-friction Docker images (jib), and, over time, amassing a list of very high-quality libraries for specific tasks (Clikt for CLIs, Jackson for serialisation, etc.).

Couple all this together and you get an incredibly productive out-of-the-box experience that is very easy to onboard new team members onto. Pairing with this setup is also great: Code With Me (the pairing tool built into IDEA) is constantly getting better, and being able to teach newbies keyboard shortcuts in IDEA makes them more productive.

I think the latter stuff is probably more important to me now as I mostly take on lead roles; my own productivity has always been high, but being able to make it scale has been immensely valuable.


The thing I’m not a huge fan of with the IntelliJ model is that it’s super tooling-dependent. I would rather work with a set of loosely-coupled tools and automate my own workflows.


To be honest, I used to be the same way: desktop Linux (Gentoo, then Arch), a tiling WM (xmonad), a 3k-line vimrc with every plugin you could want, tons of scripts to automate gdb/valgrind/etc.

Then something happened, I'm not sure what. Maybe I just got old but I no longer derived value from it being -my- setup. I just wanted whatever got the job done most efficiently.

These days I'm ultra boring. I run macOS everywhere, I still use zsh and a fairly pimped shell, but minimal vimrc and IDEA has replaced the vast majority of my dev environment. It doesn't have any of the personality it once did and tbh that doesn't bother me in the slightest.


I had the same experience. One day I realized that I no longer enjoyed doing sysadmin work, especially on my own machines.

I began to loathe complicated config files that require reading manuals to figure out how they work. Examples: SystemD, NixOS, Bazel, and BCL/GCL. Even worse are tools that silently ignore errors in config files. Examples: Gulp, ESLint, and nearly every other tool written in JavaScript.

Now I use a single machine with macOS and leave nearly every setting at the default. I don't even use Brew. I prefer to use tools that come with their own installers and updaters.

I think this change happened when I finally found work that was difficult for me and took all of my energy. I had no energy left over to waste on sysadmin work.

Also, my brain began to fill up with knowledge of systems programming, languages, and libraries. Recalling details of random config files takes more effort than before.

I think my brain has a hierarchical memory system with four types of long-term memory:

0. Knowledge available instantly

1. Knowledge retrievable with 2-3 seconds of concentration

2. Knowledge that is unavailable upon request, but usually pops into mind 1-2 minutes later.

3. Knowledge that I can retrieve only with an associated memory or idea.

When I was a teenager, my technical knowledge fit into my tier 0 memory. Then I earned a degree and worked some years and my knowledge grew to fill tiers 0 & 1 and some has been relegated to tiers 2 and 3.

I take and keep many photos because I can use them years later to retrieve forgotten memories. Maybe I will start taking screenshots of my computer to use in a similar way.


Yeah, it's not so much about it being my setup. I think it's more that I've gotten used to working this way and I don't want to learn a whole big tool which might not do exactly what I want.

With most of my projects, I have a couple of scripts to take care of the tasks relevant to that project. When I start something new, I just copy them over from a similar project and edit where needed. Super lightweight, and it will basically work until the end of time.

For work sometimes I need to adopt someone’s highly tooled workflow, and then you have to get the editor license, find the right plugins, make sure you’re on the same version and so on. And sometimes you dust off an old project and something broke in this big magical dependency chain and you have to do surgery to figure out what it is.


Changing the IntelliJ boot JDK from JBR version 11 to Azul Zulu (or Prime aka Zing) version 15 is the biggest productivity win I've made this past year. The difference is night and day.


I have been thinking of doing something similar, what differences has it made? Is it faster at indexing/build system model refreshes?


Literally everything, from startup to indexing to window / menu painting to font zooming with the mouse wheel (the last one on Linux is worthless on JBR 11).

I am very annoyed with JB for keeping customers stuck working in molasses. Changing from JBR 11 to Azul Zulu 15 puts the "Jet" back in JetBrains!


Kotlin feels like it could take over Scala at some point.

Until recently I've been very happy with Scala support in IntelliJ. Stuff just works, or at least it does for Scala 2.12 and 2.13.

I tried to kick the tires with Scala 3 and, well, it is not ready (warnings about experimental support).

Worst of all, IntelliJ is not actually compiling with Scala 3.1 but still with 2.13.6 (despite build.sbt having scalaVersion := "3.1.0").

You start messing around with manually adding and removing jars and you are in for a bad time.

Dealing with setup just saps productivity especially if you have to work on other languages.

My first bad experience with Scala and IntelliJ.


Don't they have slow compilation times? That was my experience with Scala, at least.

I'm used to TypeScript mostly.


I would love all this, but I detest the JVM.


I'm using Visual Studio 2019 on Windows with the Visual Assist add-on. All the things mentioned work flawlessly for C#. Most of them work for C++ too.


Why?

My gut reaction used to be similar due to a bad experience with it in university, but now I realise I was just ignorant.


I think it's more emotional than rational. Java and the JDK just feel proprietary.

When I want to use Go, Rust or Node I just go to the website and download it. When I want to do anything with Java I first have to agree to a license agreement.

Then the whole thing of working with Kotlin and Maven just feels off (I tried). Like people working in those languages work under a completely different paradigm from me.


I think JVM has come a long way, largely thanks to having the entire enterprise market incentivized to optimize the hell out of it, but at the end of the day it’s still an un-needed dependency.


There are certain applications where it's far from an unneeded dependency, namely building applications that are extensible by third parties. The JVM allows very simple runtime class loading and has the class-loader hierarchy to make this work well. If you want to build a plugin system in most compiled languages, you are stuck with either dynamic linking, which is unrestrained and much more likely to cause whole-program failure, or IPC, which imposes overhead.

Then you have portable applications. If I want to run on many architectures and platforms, the JVM allows me to do this easily; no need to worry about cross-compilation or even multiple binary distributions.

So yeah. Is it always necessary if you are always going to target the same platform as your dev environment and never load code dynamically? Probably not. Is the tradeoff (higher memory usage) generally worth it? I think so. Not for all cases; for those I do think you have to go with a compiled language with better memory-usage characteristics, but that is about it tbh.

Previously the JVM also sort of sucked at latency sensitive workloads unless you were very careful not to upset the garbage collector. This is now a non-issue. ZGC and Shenandoah outclass the best latency optimised collectors in languages like Golang.

So really the only downside remaining is memory overhead. I don't actually think that one can be eliminated; it's likely to remain the "can I use the JVM for this or not" linchpin for the foreseeable future. (Ignoring things like GraalVM native images, because I don't know enough about them to speculate on whether they can substantially lower memory usage.)


I did not think about runtime extensibility, but that is a good use-case.

At some point in the future, this might be solved neatly in a standards-based way with native WASM support.


Thing is, WASM is just trading one VM (the JVM) for another. You could argue this is good because the baggage in the JVM needed to support more "dynamic" languages is outdated, but I think what will likely happen is that your initially clean and fast WASM JIT will eventually succumb to a similar fate on the back of needing to support a wider range of guest languages than just Rust/C.


I deploy things on the JVM running on two architectures.

The "write once run anywhere" concept is still very useful, when required.


You can also "write once run anywhere" with compiled languages. But you don't pay a virtualization penalty.

The only advantage of a VM is that the actual executable artifact is platform agnostic.


With Hotspot, hot areas of the code are compiled, lessening the virtualisation overhead.

We compile some C libraries and use them via JNA. It takes time to create the cross-compilation environment with all the correct versions of the libraries they depend on (and the libraries that those libraries depend on).

In comparison, if a library works on Java version x, we can include it and debug it running on the JVM as necessary.


> but at the end of the day it’s still an un-needed dependency.

What are the alternatives you prefer to use?


Compiled software.


Most compiled languages that have a minimal runtime need careful coding for memory safety. Rust attempts to solve this problem, at the expense of a huge initial learning curve.

Go compiles to a static binary, has great cross-compilation tooling, and provides a runtime with memory safety. But many programmers find Go's abstraction capabilities too primitive.

GraalVM provides an AOT compiler for Java [1]. In practice, this has proven to be a niche use case [2].

[1] https://www.graalvm.org/reference-manual/native-image/

[2] https://openjdk.java.net/jeps/410


These days, articles on the subject of programming productivity are extremely hit and miss. With some articles I agree with almost everything the author says; with others, like this one, I just feel 'meh' because everything mentioned seems unimportant to me. These things do add some value, yes, but why aren't critical things mentioned, like system architecture, high cohesion + loose coupling, keeping interfaces simple, and keeping the code closely aligned with the business domain? The cost of getting these things wrong compounds over time; that's why they're orders of magnitude more important than fast typing and auto-completion.

If I had to guess the author's background, based on the article, I would assume that he/she has a corporate background and very little experience building systems from scratch end-to-end. It's only when you start building systems from scratch 'end-to-end' that you actually learn what is important in programming. Sadly, developers typically cannot get this experience while working for a corporation. Most corporations are too compartmentalized to allow individual employees to see how everything fits together into the big picture of software development.

Unfortunately, because of their high profile, corporate developers get a lot more attention in the industry even though they often have tunnel vision.


If you don't even know the basics, why do you think you can talk about more advanced things?

First, learn the basics of the trade: reading and writing code efficiently and painlessly... then you can think about high-level architecture or whatever makes you feel more important. I see a lot of people who think they know the former but can't even find a definition without clicking through some GUI menus, which takes 10x more time than if they'd just learned a couple of shortcuts. This is what is really going to compound over time, as you do it hundreds of times a day.

Besides, the author addressed your criticism in the second paragraph: _There's a lot more than just writing code that goes into being an effective software developer..._


Sure, nothing against this perspective and glad that this line was mentioned. My main point is that this kind of article doesn't deserve to be front page HN since it mostly distracts people away from more important things. I find these 'productivity porn' articles to be overly superficial and limiting to the true progress of developers... I miss articles like those written by Martin Fowler, Alan Kay and others which tried to tackle the big problems. I feel that a lot of valuable programming knowledge from the past is being swept under the rug. The software industry seems to be getting dumbed down over the past decade.


I think you're still missing the point. Knowing system architecture is not more important than knowing how to read and write code efficiently, especially when you're not designing systems most (if not any) of the time. If you can't do the latter, you most likely can't do the former. This article is good advice for those who need it; once you know those things, you will have a much easier time learning higher-level concepts and software architecture. And I, unlike you, find that those are actually the large majority of articles, as they're, paradoxically, easier to write: you can bullshit your way through buzzwords and make empirically unverifiable, outlandish claims about "microservices", "serverless", "blockchain" or whatever else may make you sound smart, without actually providing anything of real value.


Can you give some links to those articles? My reason is... I'm on mobile and kinda sleepy.



> Using code auto-formatters

I have taken part, and still sometimes take part, in discussions about adopting or improving autoformatting for some language we use.

It is incredible to think about programmers who have grown up, or will grow up, entirely with autoformatters in all their tools. To them, the idea of manually formatting their code will seem as “ancient” and “backwards” as assembly programming feels to me. That's a hallmark of true technological progress right there.


This has been said countless times before, but it's a real shame we don't yet have good solutions for separating content and presentation of code.


Is this (especially the last sentence) meant sarcastically?


No, I’m dead serious. Sure, the magnitude of progress is not as big as say moving from horses to cars, but it is similar in how people perceive the new vs the old.


Interesting. Personally I didn't see many changes. We had indent before, now it's more common, mostly because of git diffs, and of course some languages got more B&D by design. But I wouldn't say that there have been huge benefits.

Heck, if we're talking language-based tooling, I'd rate the influence of JavaDoc higher.


I think hot code reloading can be very effective. It dramatically shortens the feedback loop, which can be very important for productivity.

Unfortunately, I was rarely able to make this work in my setups, except a long time ago when I used Visual Basic. Then I worked with C++, mostly game development using SDL. The last couple of years I wrote Python, mostly for machine learning with Theano and then TensorFlow.

The startup time of an initial prototype might still be fast enough (couple of seconds) that you don't care too much and you are still productive. But as complexity adds up, and you add other slowdowns like slow NFS or whatever, this can reach minutes, and this already is annoying. But most development I do would probably not really allow for hot reloading, or I don't really know a good way. This is anyway not really common practice for Python, and whatever you try in that direction will probably be unstable and/or non-Pythonic.

And then, you have deep learning research itself, where your feedback loop is at minimum one day, but often more like a week.

I remember seeing a screencast of Notch developing some games, where he intensively used hot code reloading on functions which defined the behavior or look of some game entities. It looked like an extremely productive and powerful tool. Within half a second or so, he saw the result of his code change. He just played around a lot until the behavior was nice.


One of the best changes in the last ten years (imo) has been the widespread adoption of code formatters. Particularly those officially supported by the language.

It makes reading easier and writing faster. And no more time wasted on pointless formatting debates.


I like (and actively use) code formatters, but honestly I don't understand why people make such a big fuss about their impact. Weirdly formatted code impacts maybe 1% of my ability to read code, at least once I accept "this is a bit messy" and stop being irrationally annoyed.

> And no more time wasted on pointless formatting debates.

There will always be pointless debates: naming variables, deciding which linters to enable, using reduce() vs. a loop, classes vs. functions, and so on. This is a cultural problem which occurs because you're not focusing enough on the things that actually matter ("is the code correct?"; "can we safely ship this to production?"; "what's the long-term effect of introducing this abstraction/feature/capability?").

I love thinking about code as art, and I have numerous side projects which explore this, but when it comes to shipping real features on a deadline we need to create something which works, not something which evokes a good feeling inside me when reading the code.


> I don't understand why people make such a big fuss about its impact

I don't know, ask them? We here are on board with using them. ;)

I've met such problematic people. You can't reason with them, it's almost a religion.

But when you sell standardized tooling to your manager plus emphasize how this will reduce internal team friction, you can elegantly eliminate them from the equation.

I did it before and I plan on doing it again. We are here to write code and be productive, not to discuss aesthetics.


Oof yes I remember the first time I joined a dev team with pedantic tastes in terms of code review. It’s amazing to imagine the amount of collective time which has been wasted on trivialities over the years.


I can't fathom how many companies have filled people's brains with stuff like that. "Good practice."


I am doing all of these things and totally agree, so maybe my top habit will appeal to you too. It's using the number keys to switch between apps. I don't waste time Cmd-Tabbing around between PyCharm, Terminal, Chrome, Slack, and Sublime Text; I just tap Opt+1, 5, 2, 8, 9. Basically as soon as I've thought of what program I want to be in, I am in it. You may switch among your top ten apps hundreds of times a day; this will be your most used shortcut.

On Windows this is built in: Win+1 switches to the first app on the taskbar. On Mac I use Snap -- https://apps.apple.com/us/app/snap/id418073146?mt=12.


That's really good - I'm going to have to try this! Thanks


Pretty much agree about everything except Copilot. The short-term gains from using Copilot are good, but long term it will make your brain inferior, since you are by default searching for an autocomplete response from Copilot. Your brain is literally an autocompleter. You are getting paid a lot because you are very good at it. Losing that edge by relying on an AI to do that for you and not practising your skill would make you worthless in the long term.


This sentiment echoes Socrates' famous position regarding the written word:

>If men learn this, it will implant forgetfulness in their souls. They will cease to exercise memory because they rely on that which is written, calling things to remembrance no longer from within themselves, but by means of external marks.

Both his point about writing and your point about Copilot are likely true in part or in whole. Ultimately it doesn't matter — the technology is here to stay and its adoption will be determined by the utility it provides its user.


I have respect for Socrates but this was not one of his good quotes. It reeks of "get off my lawn", very strongly.

Many studies show writing things down (on a paper medium, not phones or any electronics however) is very helpful for gaining focus and regaining productivity. We're not storage media; our brains are immensely creative machines but our memory preservation is inferior so we found a good workaround.


I don't know. Remembering is one thing. Making Microsoft do the thinking for you is another. They are different things.


> Losing that edge by relying on an AI to do that for you and not practising your skill would make you worthless in the long term.

If you bank on your brain being the better autocompleter, I got very bad short to mid term news for you.


The sentence you just wrote was autocompleted by your brain. The thought automatically came into your mind. Even solving math problems is autocompletion. The brain automatically generates the thoughts. You think you generated them, but in reality it's just an autocompletion process that was trained through your lifetime and, through evolution, the generations before.


You're projecting. I don't program much of anything by habit -- I stop and think, store snippets and choose. My brain doesn't Ctrl+Space through sentences like a preacher going through a well-worn illustration. I stop, consider, discard, and rework.

Many people have more pattern-forming brains than I do -- everyone notices before me when something has happened two weeks in a row or something they expected to happen didn't happen. My brain just doesn't try to predict.


> I stop and think

Thinking is autocompletion by your brain. You don't create thoughts; they appear automatically. Sam Harris explains it perfectly.

https://www.youtube.com/watch?v=_FanhvXO9Pk


Don't have 1.5 hours to watch a video, but strongly disagree. Autocomplete selects from a list of previously-selected choices or choices that are made available by the context you're in. "Thinking" purposefully varies the context each neuron is firing in so that it has a chance to make a connection that's _different_ from all the ones it could complete automatically.

There are people who talk to find out what they think, and those who must think before they can talk. You sound like the first type telling the second everyone is that way.


> There are people who talk to find out what they think, and those who must think before they can talk. You sound like the first type telling the second everyone is that way.

The sentence you just wrote came to your mind automatically. The thought appeared. You think you created it. But did you have any other choice? Could you have refused to have that thought come to your mind? No, right? That's the whole point. The thoughts keep coming into your mind. You can choose to ignore some and focus on others, but the brain keeps generating them, mostly based on the environment and the information you are consuming.

I would recommend taking a few minutes and watching the video. I am happy to engage in a healthy debate once you've watched it. He explains it much better than I do. If you have ever heard Sam Harris talk, you know for sure that he is not someone who just says whatever comes to his mind. Also, he is a neuroscientist.

By autocomplete I was talking in the context of how GitHub Copilot works. It doesn't merely recommend code snippets; it modifies the snippet according to context and so on, much like the human brain.


That's what folks used to say (and some still say) about IDE autocomplete in general. And it is the case that when I type without all that IDE goodness, I type slower (e.g. is it `new URL().searchParams.get`? Or `new Url()`? Or `.params.get`?), but I don't think it makes my brain "corrode". If I use one of these APIs frequently enough, I will learn it from the autocomplete because of the frequent repetition. If I don't, I don't see the need to memorize it.
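
(For what it's worth, the API being half-remembered here is the WHATWG `URL` class, and the correct spelling is indeed `new URL(...).searchParams.get(...)`; exactly the kind of detail worth letting autocomplete carry. A quick sketch, using only the `URL` global that browsers and Node.js both provide:)

```typescript
// The WHATWG URL API: it's `searchParams` (a URLSearchParams), not `params`,
// and the class is `URL`, not `Url`. No imports needed; `URL` is a global.
const u = new URL("https://example.com/search?q=autocomplete");
console.log(u.searchParams.get("q")); // "autocomplete"
console.log(u.searchParams.get("missing")); // null
```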

This lets me stay focused on the actual problem I'm solving; the higher level problem. My job as programmer is not an autocompleter -- if it was I'd be doing something else.

I think your hypothesis is valid, but when comparing the impact of IDE autocomplete on my experience, it seems unlikely. We'd have to try copilot to be sure though. Note I think the situation is different for beginner developers and professional developers.


I too avoid using Google services, so as not to dull my internal something.


Learning to type is easily his best suggestion here. Not just type, but touch type, including the symbols. If you ever wondered why vi is modal and uses normal keys for navigation rather than cursor keys, you won't wonder any more after you learn to touch type.

It doesn't require that much practise either; a month of solid practise, 30 minutes a day, will get you over the hump easily.

I would recommend getting a full-size keyboard for this instead of the cramped, flat garbage on a laptop. Decent keyboards can be had at charity shops for a dollar.


Vi uses normal keys rather than cursor keys because the keyboard it was developed on had arrows printed on those keys, and no cursor keys.

People continually want to attribute this to some rational design consideration, but it's just post-hoc reasoning. If vi were authored 10 years later it would use arrow keys, and there's no reason to stick with this terrible UX.


I think it's more of a lucky accident. There really is something to be said for hjkl scrolling. I enable it in as many apps as I can because I find it genuinely more comfortable than arrow keys. And that's not because I'm some unreformed grognard, I was introduced to vim only a few years ago.


I just have trouble buying it. Competitive gamers earn their living navigating directionally using a keyboard. If placing the navigation keys in a line on the home row were really so much better they would all be doing it by now.


Competitive gamers aren't constantly switching between navigation mode and typing mode, and they use only a small fraction of all the keys on the keyboard.

Also, if their experience is supposed to be indicative, HJKL should be replaced by WASD which is apparently much more efficient.


Yeah I agree, wasd would be an improvement, or ijkl. hjkl is an anachronism and makes no sense.


Gamers use WASD for navigation because their right hand is on the mouse. Programmers try to minimize mouse usage. It's a different set of constraints.


But if the linear key layout is better, why isn't it asdf?


Uh.. gamers don’t use arrow keys..?


The point I am making is that they use an inverted-T key configuration, not a flat line of keys, which doesn't make sense.


That’s more historical than anything. Especially because the S key is usually not used in WASD: instead of walking backwards, a combination of mouse+A/D is used. If you look around you’ll see some pros use ASD with S replacing W. In MMO PVP where more keys are necessary for abilities, SDF and ESDF are also a thing.


Do you really scroll using the keyboard? I always use the trackpad, which is right between my hands as I type, and I find it incredibly slow to use the keyboard for navigation unless it's just a few lines up/down... With the trackpad, without moving my hands more than a few centimeters, I can go anywhere in a file instantly.


Is the trackpad built into your keyboard?


No, I use the laptop all the time, even when on big screens. Tried using Apple's separate keyboard and trackpad, but I found it didn't work as well.


It's on the home row, the starting position for your right hand. Farting about with the arrow keys or the mouse takes you right away from where you can start typing again. It's much faster if you learn to touch type.


Keybr.com is great for this. I've been touch typing all my life, but when I started programming, I noticed that typing brackets, parens, and a bunch of other symbols really slowed me down. It's great to practice those as well.


Switching to Dvorak on a Qwerty layout is really effective. It’s way more punishing to look down at the wrong keys than to not look down. Not for the impatient though!


Cannot agree with this enough. All the dumb keyboard layout cargo culting aside, this is why I recommend people learn Dvorak. It has been hugely successful for everyone I’ve gotten to make the transition.


Some more:

Have a deterministic work environment that can be set up and torn down in a single command.

Use standard names in build tool configurations (buildtool clean, buildtool lint, buildtool build, buildtool test, buildtool run, etc).

Set up your project such that there are only 6 steps to getting everything working:

- Commands to download

- Commands to install required support items that can't be automated

- Single command to install everything that's needed in the right places / start your deterministic work env

- Single command to build the project

- Single command to run the tests

- Single command to run the live app

Default to running all of one thing, with options to run a subset (i.e. run ALL tests if not specified otherwise).

Be an adequate typist (40 wpm or so). If you're doing software development right, you'll be spending only around 10% of your development time typing, so past a point typing speed won't help you anymore.

Know how to use your debugger. There are some problems that are simply easier to solve with a debugger than by other means.

Know when to use what debugging technique.

Set things up such that everything gets autoformatted automatically.
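As a concrete sketch of the standard-names idea in an npm project (the specific tools here are illustrative assumptions, not prescribed by the comment above):

```json
{
  "scripts": {
    "clean": "rimraf dist",
    "lint": "eslint src",
    "build": "tsc -p .",
    "test": "jest",
    "start": "node dist/main.js"
  }
}
```

Whatever the underlying tools, keeping the verb names conventional means anyone can run `npm run build` or `npm test` on an unfamiliar project without reading docs first.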


+1 for zsh-autosuggestions. It is the most important plugin I use because all I have to do is remember the first 2 characters of a command to get the suggestion and autofill it on the command line. It saves so much time and frees up space in my head for more important things to remember.


The two largest productivity boosters for me this year were maintaining a code snippets library (I use Obsidian for this) and a tools file that I import into most of my projects; the file includes functions that I use frequently, e.g. conversions between various date-time formats.
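For instance, a tiny hypothetical tools module for date conversions might look like this (the function names and format choices are illustrative, not the commenter's actual code):

```typescript
// Illustrative date-conversion helpers for a personal "tools" file.

// Epoch milliseconds -> "YYYY-MM-DD" (UTC).
function toIsoDate(epochMs: number): string {
  return new Date(epochMs).toISOString().slice(0, 10);
}

// "YYYY-MM-DD" -> epoch milliseconds at UTC midnight.
function fromIsoDate(iso: string): number {
  return Date.parse(`${iso}T00:00:00Z`);
}
```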


> In weakly typed languages like TypeScript

I guess you mean JavaScript?

In regards to speed, I really really recommend Vim. Macros are sorely missing in a lot of editors, and they are really a great way to skip a huge amount of repetitive work.


No, I think he means TypeScript. TypeScript doesn't have (or even try to have) a sound type system. It's good to know its limitations and try to avoid code that will blow up despite passing type checking.


Can you give some examples of what you use macros for? I've found I don't need macros in an editor that supports (1) multiple cursors, and (2) multi-file find/replace with regex groups.


Now that I think about it, I'm not sure I can come up with any non-contrived example (maybe this [1]). I guess what's good is that it fits right into the rest of Vim very nicely, so I can just edit like normal, instead of fiddling with regexes.

[1] https://youtube.com/watch?v=FXCitlsA7eQ


What a fun example! That's exactly the sort of stuff I use multiple cursors/regex for. Due to a bit of luck (the refs only appear once, and only appear in order of mention), that specific example ends up being quite a bit easier and faster with multiple cursors (and I believe his example also relies on that). For the more general case, multiple cursors wouldn't cut it though, so I'd have to jump out and use code. And multiple cursors definitely wouldn't scale to multiple docs!

For fun, I did that same example with multiple cursors + vs code: https://www.youtube.com/watch?v=H1uvNbtqJGk


What kind of editor/IDE made for programming is missing macros?? Must be a really shitty one if it can't even do that.


Neither Visual Studio nor Xcode has them.


VSCode, I think?


Use the IDE as a structural editor, not a glorified text painter.


The dev-mode-only functionality in an app is a really great one: one thing I like to do is put in keyboard shortcuts to automate filling forms, populating state, doing quick navigations to nested parts of the app, etc.
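As a hedged sketch of this idea (the state shape, environment check, and key combination are all made up for illustration), a dev-only form-filling shortcut might look like:

```typescript
// Hypothetical app state; in a real app this would be your store/form model.
const appState = { loginForm: { user: "", pass: "" } };

// Fill a form with throwaway test credentials.
function fillTestCredentials(form: { user: string; pass: string }): void {
  form.user = "dev@example.com";
  form.pass = "test-password";
}

// Wire up the shortcut only outside production builds and only in a browser.
const isDev = (globalThis as any).process?.env?.NODE_ENV !== "production";
const doc = (globalThis as any).document;
if (isDev && doc) {
  doc.addEventListener("keydown", (e: any) => {
    if (e.ctrlKey && e.shiftKey && e.key === "F") {
      fillTestCredentials(appState.loginForm);
    }
  });
}
```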


This is great. I've found "find file" to be the most useful tool. It's usually one of the first things I point out during live-coding as soon as I see someone click through the source tree.


Indeed. IDEA has find by filename, find by class, and find in files. Coupling these with its quick file switcher, which keeps recent files in a stack, lets you quickly move between files you have recently edited (and jump back from something like "go to definition"), which really increases the speed of code exploration.


Yep, anyone still using tabs to navigate between files: stop doing it! Learn to use the recent files switcher plus the general find-file, find-type shortcuts etc.

First thing I do on a new installation of IntelliJ or VSCode: remove the stupid tabs.

EDIT: emacs doesn't have tabs, obviously, as it doesn't need them, but some crazies have been pushing to make the tabs package installed AND active by default so newbies can feel more familiar with it. That's incredibly backwards: newbies should learn not to rely on inferior methods of file navigation and instead use the appropriate tool for the job, like the recent-files switcher that's present in every editor I know. In emacs do this:

   (require 'recentf)
   (recentf-mode 1)
   ;; then bind recentf-open-files to a convenient shortcut, e.g.:
   (global-set-key (kbd "C-x C-r") 'recentf-open-files)


This is a great list! Unfortunately I find a lot of developers are very stuck-in-the-mud about a lot of these things. For example using type hints in JavaScript / Python, ensuring your build system doesn't break IDE features, keeping compile times low, etc.

I've found it really hard to persuade some people that these things are really critical, even though it seems almost so obvious that it's hard to know how to explain it.

Like, how would you persuade someone that they should comment their code.. or use descriptive variable names?


I’d like to add Fig terminal to this list. It offers autocomplete with hovering documentation to explain what flags do. It will even search npm and autocomplete package names for you.


Have you added copilot to your workflow?


Yeah! I talk about that near the bottom of the post.

I'm a really big fan of it; it unironically feels like a bona fide AI assistant.


I saw that. Sorry, I didn't realize you had actually posted this yourself. I meant to direct the question at other readers.


Alternative title: Habits I've developed for fast and efficient rat race.


Isn't everything a rat race if you boil it down enough?

Working for yourself? Rat race. Working for others? Rat race.

The human reward system works in a way that doing something repetitive, that is just novel enough and just challenging enough makes us feel great.

There's no such thing as a non rat-race life. The point is enjoying it. Otherwise you end up feeling dread and existential anxiety.

Life is meaningless, and at the same it's meaningful. It truly depends on your perspective, and a bit of self delusion.


>> There's no such thing as a non rat-race life.

There is. If you've never met a Buddhist monk, I highly recommend it.


It takes a lot of the frustration out of the rat race. You can focus on what you actually want to get done. You may then choose to spend your time on more rat racing, or on doing other stuff.


It's perfectly fine to enjoy a day job. I truly love mine. I enjoy working at it, and I enjoy the freedom it gives me to do other stuff after work.

I enjoy the monetary reward but also the act of solving hard problems every day.


> It's perfectly fine to enjoy a day job.

It is perfectly fine.



