Uncommon Interview Advice: Ditch the Training Wheels
TL;DR:
Most technical interviews are very different from real-life programming (shocker, I know). While this might be obvious, many people (myself included) don't follow this observation to its logical implication: in a timed interview, besides having someone breathing down your neck, you won't have access to your usual tools or be able to rely on the inefficient habits or workarounds you've picked up along the way.
So it then follows that you should:
- Wean yourself off tools—especially when prepping for interviews or doing side projects. No autocomplete, no starter code—just you, Notepad++, and vibes.
- Be intentional about coding best practices—use intuitive variable names, write clear comments, and take a systematic approach to debugging.
- Talk through your thought process—whether with a rubber duck, a friend, or even yourself. Make verbalizing your logic a habit as you code.
Coding proficiency
Back in my freshman year, my roommate used Notepad to code his programming assignments. To me, this seemed wildly inefficient. Notepad lacked autocomplete, syntax highlighting, code formatting—everything that makes coding smoother. His reasoning was simple, even admirable: he believed that doing so would train him to be a better programmer in the long run, as he wouldn't rely on the crutches of modern IDEs.
Honestly, bro was lowkey cooking, but I couldn't see the vision—I couldn’t imagine a world where I wouldn’t have access to those tools. So, I stuck with my Eclipse (yes, first-year programmer—I know).
Looking back, I realize I was wrong.
Enter the Technical Interview.
In many technical interviews today, you'll probably use some version of CoderPad or Replit. Larger companies have their own internal tools that function similarly. In almost all cases, these tools resemble Notepad more than they do IntelliJ or a fully decked-out VSCode.
In the past, this wasn’t a big deal because you weren’t expected to run your code. You’d write out a solution—sometimes even pseudocode—and call it a day. But recently, at least in my experience, interviewers are increasingly asking you to run your code live.
And that’s when your weaknesses can become glaringly obvious.
If you've been leaning on autocomplete and code completion to patch up your gaps, the interview will quickly expose you. Here are a few of my own embarrassing moments:
- Forgetting types: Is it `str`, `string`, or `String`? I don't know! I'm coding in Python, but my interviewer uses TypeScript, so they can't help me out, yet they still expect my code to run. Such a small thing, yet it can send you into a spiral.
- Forgetting return signatures and other OOP principles. Especially true if you only code on LeetCode, since these are already implemented for you. So you forget stuff like (see the sketch after this list):
  - Should I put everything in a class?
  - Should I create an instance of the class in order to access my method?
- In languages like C and C++, you also run into issues with namespaces (not to be confused with header files!). For example, suppose you want to manipulate an array of strings. You remember to `#include <string>`, but your code still doesn't work. That's because you need the `std` namespace to access `std::string`. Without specifying `using namespace std;` or prefixing `string` with `std::`, the compiler doesn't recognize it. And the errors aren't that helpful either.
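To make that second point concrete, here's a minimal Python sketch of the LeetCode-style scaffolding that's suddenly your problem in a bare editor. The `Solution` class and `two_sum` method are just illustrative names, not from any particular platform's harness:

```python
from typing import List

class Solution:
    def two_sum(self, nums: List[int], target: int) -> List[int]:
        """Return indices of the two numbers that add up to target."""
        seen = {}  # value -> index
        for i, n in enumerate(nums):
            if target - n in seen:
                return [seen[target - n], i]
            seen[n] = i
        return []

# On LeetCode, the hidden harness instantiates the class and calls the
# method for you. In a bare pad, you have to remember to do it yourself:
if __name__ == "__main__":
    solution = Solution()  # yes, you need an instance to call the method
    print(solution.two_sum([2, 7, 11, 15], 9))  # [0, 1]
```

None of this is hard, but it's exactly the kind of boilerplate you stop thinking about when a template always provides it.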
Because these are very basic things, you end up appearing like a fledgling bird learning to fly for the first time, even if you've been solving LeetCode hards in minutes. As long as you've had a crutch covering a deficiency, you'll feel like a 10x programmer—until the tool is taken away.
This is even more relevant now, in the age of ChatGPT and Copilot, where you can build a full-stack application without explicitly writing any code yourself, simply tweaking what GPT outputs.
While this makes you extremely productive in the short term, understand that the chickens will come home to roost soon enough.
Most people, I'd posit, don't use language documentation; they simply google the thing they're trying to do in the language they're using. Using documentation is often slow because, in some ways, it requires you to know what you don't know. For example, say I'm trying to convert an array to a string (I sketch this one out after the list below). How would I even go about looking up the documentation for that? You'd need to know the data structure and look through its methods to see whether it has an implementation for that. And sometimes the method exists in the standard library but not on the data structure itself. So, in most cases, it's a lot easier to just google "how to convert an array to a string." In an interview, though, you don't have that liberty. So your options are:
- Master the language inside out so you know these things at your fingertips.
- Get comfortable with documentation—learn to break down what you want to achieve in a way that works well with the docs.
- Rely on the interviewer—depending on the interviewer, they can help, but I wouldn’t count on it. It’s safer to assume that this will be used against you, especially in a very competitive market.
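To pick up the array-to-string example: in Python, the answer you'd eventually google your way to is `str.join`, and the reason it's hard to find by browsing is that it lives on `str`, not on `list`. A quick sketch (the variable names are mine):

```python
nums = [1, 2, 3, 4]

# The method lives on str, not on list, so scrolling through list's methods
# in the docs never surfaces it. help(str.join) is where it actually sits.
as_string = ", ".join(str(n) for n in nums)
print(as_string)  # 1, 2, 3, 4
```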
Debugging
If you're anything like me, your debugging process might look like this (I have to differentiate between pre-GPT and post-GPT):
Pre-GPT:
- Look at Error/Exception:
- A helpful language like Java: usually shows you exactly where the problem is—easy, go fix that.
- A less helpful language like JavaScript/C++:
- Helpful Error (Rare): Solve it.
- Unhelpful Error (Common): Copy error and paste into Google; usually, the first Stack Overflow link has the solution—copy and tweak.
- Trying to do something, but not sure how to:
- Google what you're trying to do -> visit first Stack Overflow / Geeks For Geeks Link -> copy and tweak
Post-GPT:
- Simply copy the code snippet and error into ChatGPT, and it'll (9 times out of 10) give you the fix.
- If you're using something like Cursor, you don’t even need to leave your IDE. You can get everything explained in-house and pull in relevant documentation to give the AI all the context it needs.
In an interview, though? Nope. You’ve got to rely on the docs. So, train yourself to always refer back to them.
Systematic Debugging Process
IRL, you also have access to tools like a debugger, plus the classic "change something, click run, and hope for the best." While these are expedient when you're trying to meet deadlines and save precious brain cycles, in an interview you don't have this benefit.
So it behooves you to have a systematic approach to debugging[1]:
- How does the output differ from what you expect?
- Sanity checks: Are variables misspelled? Off-by-one errors? A compiler or interpreter will catch the misspellings for you, but showing the interviewer you can catch them without one is always a good thing.
- Make a list of suspects: Formulate hypotheses about what could be going wrong.
- Test systematically: Change one thing at a time, predict how it will affect the program, and run the test. Eliminate suspects until you find the root cause. (A short sketch of this loop follows below.)
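Here's a small, made-up Python illustration of that loop: a windowed-sum function with a suspected off-by-one, where I state the expected output, list the suspects, and test one change at a time. The function and test values are hypothetical, not from a real interview:

```python
def max_window_sum(nums, k):
    """Return the maximum sum of any k consecutive elements (buggy draft)."""
    best = float("-inf")
    for i in range(len(nums) - k):           # suspect (a): off-by-one in the range
        best = max(best, sum(nums[i:i + k]))  # suspect (b): slice bounds wrong
    return best

# Step 1: how does the output differ from what I expect?
nums, k = [1, 2, 3, 10], 2
print(max_window_sum(nums, k))   # prints 5, but the last window [3, 10] sums to 13

# Steps 2-4: pick one suspect, predict the effect, change only that, re-run.
# Prediction: range(len(nums) - k + 1) includes the final window -> expect 13.
def max_window_sum_fixed(nums, k):
    best = float("-inf")
    for i in range(len(nums) - k + 1):        # the one change under test
        best = max(best, sum(nums[i:i + k]))
    return best

print(max_window_sum_fixed(nums, k))          # 13 -> suspect (a) confirmed
```

The point isn't the bug itself; it's saying the prediction out loud before each run so the interviewer can watch you eliminate suspects.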
Verbalizing Your Thought Process
This one tripped me up a lot, especially after teaching for years (as a TA and at Juni Learning). Sometimes, what’s intuitive to you is not clear to someone else. This is usually because you’re making leaps in logic or assumptions based on experience.
How I’ve improved is by using the rubber duck method. Before I write any code, I’ll write comments about what I’m trying to achieve and my logic, in as clear and concise a manner as possible. I do this while talking out loud. Once you get in the habit of doing this, it becomes much easier to replicate in an interview setting.
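For what it's worth, a comment-first pass in Python looks roughly like this for me; the problem here is just a stand-in to show the shape of it:

```python
# Goal: given a list of words, return the ones that are anagrams of a target word.
# Plan (saying this out loud as I type):
#   1. Normalize the target once: lowercase it and sort its letters.
#   2. Normalize each candidate word the same way and compare against the target.
#   3. Collect matches in order; return an empty list if nothing matches.

def find_anagrams(words, target):
    target_key = sorted(target.lower())  # step 1
    return [w for w in words if sorted(w.lower()) == target_key]  # steps 2-3

print(find_anagrams(["silent", "enlist", "google"], "listen"))  # ['silent', 'enlist']
```

Reading those comments aloud as you write them is basically the interview habit, minus the interviewer.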
Another useful tip: I’ve had success explaining a code snippet to ChatGPT, then asking it to explain it to me as if I’d just taken my first programming class. It can be a great way to test your understanding.
To Summarize:
- Try to rely less on tools. Not necessarily at work, but more so during side projects or LeetCode practice. I personally use Sublime or VSCode without plugins, depending on my mood.
- Learn to rely on documentation.
- Develop strong, systematic debugging habits that don't rely on a debugger.
- Use the rubber duck method to improve verbalizing your logic.
Keep grinding 🫡
Notes:
[1] This is a great resource