Hi,

My question certainly stems from the imposter syndrome I'm experiencing right now for no good reason, but when looking to resolve embedded C problems, I come across a lot of posts from people who have a deep understanding of the language and of how an MCU works at the machine-code level.

When I read these posts, I do understand what the author is saying, but they really make me feel like I should know more about what's happening under the hood.

So my question is this: how do you rate yourself in your most-used language? Do you understand its subtleties and nuances?

I know this doesn't necessarily make me a bad firmware dev, but damn does it make me feel like one when I read these posts.

I get that this is a subjective question without any single right answer, but I'd be interested in hearing about different experiences, in the hope of reducing my imposter syndrome.

Thanks

  • lmaydev@lemmy.world · 1 month ago

    I've been using C# since .NET 2, which came out back in 2005 (lol).

    I’d happily call myself an expert. I can do anything I need to and easily dive into the standard library source code or even IL when needed.

    But even then there are topics I could easily learn more about, particularly the very performance-focused struct features and hardware intrinsics.
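
    Just to put concrete names to those (my own quick sketch, nothing definitive): Span<T> over stack-allocated memory and the System.Runtime.Intrinsics SIMD APIs are the sort of thing I mean.

    ```csharp
    // Illustrative sketch only: two of the performance-focused features mentioned above.
    using System;
    using System.Runtime.Intrinsics;
    using System.Runtime.Intrinsics.X86;

    class Demo
    {
        static int SumStackBuffer()
        {
            // Span<T> is a ref struct, so it can wrap stack memory with no heap allocation.
            Span<int> buffer = stackalloc int[8];
            for (int i = 0; i < buffer.Length; i++)
                buffer[i] = i;

            int sum = 0;
            foreach (int value in buffer)
                sum += value;
            return sum; // 0 + 1 + ... + 7 = 28
        }

        static Vector128<float> AddFourFloats(Vector128<float> a, Vector128<float> b)
        {
            // Hardware intrinsics map (almost) one-to-one onto SIMD instructions.
            if (Sse.IsSupported)
                return Sse.Add(a, b);
            throw new PlatformNotSupportedException();
        }

        static void Main()
        {
            Console.WriteLine(SumStackBuffer()); // prints 28
            if (Sse.IsSupported)
                Console.WriteLine(AddFourFloats(Vector128.Create(1f), Vector128.Create(2f))); // <3, 3, 3, 3>
        }
    }
    ```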

    I've found LLMs to be super useful when you have a very specific question about a feature. I use Bing AI at work, which cites its sources, so you can dive into the linked articles for more detail.

    Programming is a never-ending learning journey and you just have to keep going. When you run into something you don't fully understand, do a deep dive; there are resources out there for everything.