An Interview with Myself


This was the literal response from my "interviewing myself" prompt on Meta.ai

If I'm going to advocate for behavioral questions on steroids as an interview technique, I may as well turn my gaze inward and act as my own test subject.  I'm astroturfing a bit because it's a conversation with myself, but it still shows the potential of the method.

[Can you describe a recent project you've been working on?]

Sure.  I recently started a project to create an online travel agency with the assistance of Generative AI tools.  

[Neat.  What do you plan to do with the project?  How does it differentiate itself from other online sites like Expedia or Booking.com?]

I don't intend to take it live.  It's a proof of concept to show where GenAI can be useful and where it still has significant limitations.

[I see.  What have you discovered so far?]

I'd already had some experience with GenAI, so there were a few things I expected going in.  Even so, GenAI as a coding assistant is a powerful tool.  It's great for code completion, especially when I'd otherwise need to spend significantly more time searching for the appropriate API call.

[As a software engineer, how much do you view it as a productivity enhancement vs. a potential replacement for aspects of your job?]

It's definitely an enhancement.  Even on its best days, it's still only right about 80% of the time.  It will often make nonsensical suggestions, but they're usually close enough that it still saves me the effort of needing to do in-depth API research.  In that sense, it's useful as an oracle.

[What do you mean by the term oracle?]

In computer science, an oracle is a logical construct that answers specific yes-or-no questions without revealing its reasoning.  For instance, in computability theory, a halting oracle can tell you whether a particular program halts or runs forever, but it can't explain why.

Coding assistants are like this - you can test their output quickly to determine if it's useful for your purposes.  They're great for composing code with complex, nested data structures - so things like lists that contain dictionaries that contain lists.  

It's not uncommon for me to spend 30 minutes sorting through the mechanics of nested data structures.  In these cases, the assistant composes the code for me in seconds, and I can verify that it creates the right output in an additional minute or two.
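To make that concrete, here's a hypothetical example of the kind of snippet an assistant can produce in seconds: grouping a list of booking records (dictionaries that themselves contain lists) by destination.  The data and field names below are invented purely for illustration.

```python
# Hypothetical booking records: a list of dicts, each containing a list.
bookings = [
    {"traveler": "Ana", "destination": "Lisbon", "flights": ["TP201", "TP202"]},
    {"traveler": "Ben", "destination": "Lisbon", "flights": ["TP203"]},
    {"traveler": "Cal", "destination": "Tokyo", "flights": ["NH101", "NH102"]},
]

def flights_by_destination(records):
    """Build a dict mapping each destination to all flight codes booked for it."""
    grouped = {}
    for record in records:
        grouped.setdefault(record["destination"], []).extend(record["flights"])
    return grouped

# Verifying the output takes a minute or two: print it and eyeball the grouping.
print(flights_by_destination(bookings))
# → {'Lisbon': ['TP201', 'TP202', 'TP203'], 'Tokyo': ['NH101', 'NH102']}
```

The structure itself (a dict of lists built from a list of dicts of lists) is exactly the sort of nesting that eats 30 minutes by hand but is trivial to verify once written.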

[What else have you noticed using GenAI?]

It's worth asking it for basic strategic guidance.

[Do you have an example?]

Yeah.  Asking it things like "I'm interested in starting an online travel agency, what steps do I need to consider?" or "Can you help me design the home page for my travel site?" can bring valuable insights.  Not so much because it produces novel advice.  Many times it produces a lot of awful or impractical suggestions.  But, it catalogs things well and makes you consider concepts you may otherwise have missed.  Again, it's useful as an oracle.

[Do you have an example of when it made you consider a new concept?]

Sure.  It suggested that I add search functionality to the home page.  That's 'no-duh' advice, and it didn't provide enough context to recognize the extended search functionality a travel site needs versus the basic needs of a blog.  It just sees 'site' and spits out homogeneous advice.

But it did make me consider whether a more universal search mechanism was possible.  And when I asked about libraries or APIs that could help build universal search, it suggested natural language processing and the spaCy library in Python.
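As a sketch of what "universal search" could mean here: classify a free-text query into an intent (flight, hotel, activity) before routing it to the right search backend.  A real version might use spaCy's tokenizer and named-entity recognizer; the keyword rules below are a stand-in I've invented only to illustrate the idea.

```python
# A deliberately simple stand-in for NLP-based query routing.
# A real implementation might use spaCy for tokenization and entity
# recognition; these keyword sets are hypothetical, for illustration only.
INTENT_KEYWORDS = {
    "flight": {"flight", "fly", "airfare", "plane"},
    "hotel": {"hotel", "room", "stay", "lodging"},
    "activity": {"tour", "tickets", "museum", "excursion"},
}

def classify_query(query):
    """Return the first intent whose keywords appear in the query, else 'general'."""
    words = set(query.lower().split())
    for intent, keywords in INTENT_KEYWORDS.items():
        if words & keywords:
            return intent
    return "general"

print(classify_query("Find me a flight to Lisbon"))  # → flight
print(classify_query("cheap room near the beach"))   # → hotel
```

Even this toy version shows the design question a universal search box raises: the site has to infer *what kind* of thing the user wants before it can search for it.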

End of Mock Interview Question

Hopefully, I've demonstrated how a simple question can open up a range of responses.  It requires focused engagement from both the interviewer and the candidate, but that's a necessity anyway.  A candidate who blindly rushes through questions - even if the answers are satisfactory - is someone I'd have second thoughts about hiring.

Did they complete all of the technical questions because they're that adept, or have they seen these questions before?  If they've seen the questions beforehand, you're not giving them a chance to demonstrate their creativity or critical thinking skills.

More open-ended questions give them the ability to show the type of co-worker they'll be.  And the choice of branches from the main question is nearly limitless.  I could've asked all of the following as further prompts - 

  • Which LLM or LLMs are you using and why?
  • What programming languages are you using?
  • Describe the basic architecture of your site.  What known trade-offs are you making?
  • What would you add to the site if you had more time or money?
  • What are you leaving out of your feature set due to the time constraints?  How would that ultimately cause problems if released to production?

All of these questions help the interviewer gauge the skills that matter in software engineering.  Someone may point out that none of this demonstrates how good a coder the candidate is, but that's largely irrelevant - 
  • Knowing how to compose the parts of a system so that it's reliable and maintainable is far more important in software engineering than coding.
  • If you have solid fundamentals, but your coding style is a little shaky, it's easy to correct that style so that it conforms to the company's standards and helps you become a better coder.
  • There is absolutely no consensus on what good code is.  I often half-jokingly compare software engineers to building contractors - regardless of the previous installation (or existing code), the next person to come along will grumble about how bad the last person's work was.  
    • In addition, the clever code that someone produces in an interview is unlikely to be representative of what's needed to maintain the company's existing codebase.
Until next time, my human and robot friends.