• phorq@lemmy.ml · +171/-1 · 1 year ago

    ChatGPT just makes me feel like I’m doing code review for junior developers who don’t understand the task… wait…

  • blackbirdbiryani@lemmy.world · +154/-6 · 1 year ago

    For the love of God, if you’re a junior programmer, you’re overestimating your understanding if you keep relying on ChatGPT thinking ‘of course I’ll spot the errors’. You will, until you won’t, and you end up dropping the company database or deleting everything in root.

    All ChatGPT is doing is guessing the next word. And it’s trained on a bunch of bullshit coding blogs that litter the internet, half of which are now ChatGPT-written (without any validation, of course).

    If you can’t take 10-30 minutes to search for, read, and comprehend information on Stack Overflow or the docs, then programming (or problem solving) just isn’t for you. The junior end of this field is really getting clogged with people who want to get rich quick without doing any of the legwork behind learning how to be good at this job, and ChatGPT is really exacerbating the problem.

    • chicken@lemmy.dbzer0.com · +43/-25 · 1 year ago

      If you can’t take 10 - 30 minutes to search for, read, and comprehend information on stack overflow or docs

      A lot of the time this is just looking for syntax, though: you know what you want to do, and it’s simple, but it’s gated behind busywork. To me this is the most useful part of ChatGPT: it knows all the syntax, will write it out for you, and will answer clarifying questions, so you can remain in a mental state of thinking about the actual problem instead of digging through piles of junk for a bit of information.

      • CoopaLoopa@lemmy.dbzer0.com · +23/-2 · edited · 1 year ago

        Somehow you hit an unpopular opinion landmine with the greybeard devs.

        For the greybeard devs: Try asking ChatGPT to write you some Arduino code to do a specific task. Even if you don’t know how to write code for an Arduino, ChatGPT will get you 95% of the way there with the proper libraries and syntax.

        No way in hell I’m digging through forums and code repos for hours to blink an LED and send out a notification through a webhook when a sensor gets triggered if AI can do it for me in 30 seconds. AI obviously can’t do everything for you if you’ve never coded anything before, but it can do a damn good job of translating your knowledge of one programming language into every other programming language available.

        • Kogasa@programming.dev · +26 · 1 year ago

          It’s great for jumping into something you’re very unfamiliar with. Unfortunately, if you often find yourself very unfamiliar with day to day tasks, you’re probably incompetent. (Or maybe a butterfly who gets paid to learn new things every day.)

        • sj_zero@lotide.fbxl.net · +2 · 1 year ago

          BIIIIG problem: The last 5%.

          Did ChatGPT just hallucinate it? Does it exist but it isn’t used like ChatGPT says? Does it exist but it doesn’t do what ChatGPT thinks it does?

          I use ChatGPT sometimes to help out with stuff at home (I’ve tried it for work stuff but the stuff I work on is niche enough that it purely hallucinates), and I’ve ended up running in circles for hours because the answer I got ended up in this uncanny valley: Correct enough that it isn’t immediately obviously wrong, but incorrect enough that it won’t work, it can’t work, and you’re going to really have to put a lot of work in to figure that out.

      • el_bhm@lemm.ee · +17 · 1 year ago

        Just a few days ago I read an article on the newest features of Kotlin 1.9. None of it was true.

        The internet is littered with stuff like this.

        If the model is correct, you are correct. If the model is not correct, you are working from false assumptions.

        • chicken@lemmy.dbzer0.com · +8/-2 · 1 year ago

          No difference there; either way, your information may be wrong or misleading. Running the code and seeing what it does is the solution.

      • wizardbeard@lemmy.dbzer0.com · +22/-8 · edited · 1 year ago

        The more you grow in experience, the more you’re going to realize that syntax and organization are the majority of programming work.

        When you first start out, it feels like the hardest part is figuring out how to get from a to b on a conceptual level. Eventually that will become far easier.

        You break the big problem down into discrete steps, then figure out the best way to do each step. It takes little skill to say “the computer just needs to do this”. The trick is knowing how to speak to the computer in a way that makes sense to the computer, to you, and to the others who will eventually have to work with your code.

        You’re doing the equivalent of a painter saying “I’ve done the hard part of envisioning it in my head! I’m just going to pay some guy on Fiverr to move the brush for me”


        This is difficult to put into words, as it’s also not about memorization of every language-specific syntax pattern. But there’s a difference between looking up documentation or previous code for syntax, and trying to have ChatGPT turn your pseudocode notes into working code.

        • emptiestplace@lemmy.ml · +19 · 1 year ago

          The more you grow in experience the more you’re going to realize that syntax and organization is the majority of programming work.

          organization, absolutely - but syntax? c’mon…

        • Stumblinbear@pawb.social · +13 · edited · 1 year ago

          I’m a pretty senior dev and keep ChatGPT open for quick searches. It’s great for helping me figure out what to Google in the cases where I can’t think of the name of a pattern or type I’m looking for. It also helps quite a bit with learning about obscure functions and keywords in SQL that I can then research further.

          Hell, I use Copilot daily. Its auto complete is top-tier

          • anti-idpol action@programming.dev · +1 · 1 year ago

            Copilot is good for tedious stuff like writing enums. But otherwise I more often than not need to accept only the suggested line or particular words, since in multiline snippets it can do stupid things, like exiting outside of main() or skipping error checks.

        • chicken@lemmy.dbzer0.com · +10 · 1 year ago

          I’ve been programming for decades; I’m not actually a beginner. A mistake I made early on was thinking that everything I learn will be worth the time to learn it, and will always increase my overall skill level. But (particularly as relates to syntax) it’s not and it doesn’t; something I use only once or rarely, something that isn’t closely connected with the rest of what I often do, I’ll just forget after a while. I greatly prefer being broadly capable of making things happen to having a finely honed specialization, so I run into that sort of thing a lot; there is an ocean of information out there and many very different things a programmer can be doing.

          I think it is an important and valuable lesson to know when to get over yourself and take shortcuts. There are situations where you absolutely should never do that, but they are rare. There are many situations where not taking shortcuts is a huge mistake and will result in piles of abandoned code and not finishing what you set out to do. AI is an incredibly powerful source of shortcuts.

          You’re doing the equivalent of a painter saying “I’ve done the hard part of envisioning it in my head! I’m just going to pay some guy on Fiverr to move the brush for me”

          More like you’ve coded the functionality for a webapp, have a visual mockup, and pay some guy on fiver to write the CSS for you, because doing it yourself is an inefficient use of your time and you don’t specialize in CSS.

          As for the issue of a new programmer ending up with problems because they rely too much on AI and somehow fail to learn how to model the structure of programs in their head, that’s probably real, but I can’t imagine how that will go because all I had to go on when I was learning was google and IRC and it’s totally different. Hope it works out for them.

        • eclectic_electron@sh.itjust.works · +6/-1 · 1 year ago

          TBF that’s how many master artists worked in the past. The big art producers had one master painter guiding a bunch of apprentices who did the actual legwork.

          • Rodeo@lemmy.ca · +3 · 1 year ago

            And senior devs guide junior devs in the same way. The point is the masters already did their time in the trenches. That’s how they became masters.

        • 257m@sh.itjust.works · +2 · edited · 1 year ago

          That’s the exact opposite of the problem for most people, though. Syntax is hard at first because you are unfamiliar with it, and it gets more natural over time. Algorithms are easier to think about conceptually for a person with no programming experience, as they are not programming-specific. If you are an experienced developer struggling over syntax yet breezing through difficult data structure and algorithm problems (e.g. thinking about the most efficient way to upload constantly updating vertex data to the GPU), you are definitely the anomaly.

        • sj_zero@lotide.fbxl.net · +1 · 1 year ago

          There’s a 5 hour interview with John Carmack on YouTube where he talks about transitioning from really caring deeply about algorithms and the like to deeply caring about how to make a sustainable and maintainable codebase you can have an entire team work on.

          Often, a solution that is completely correct if all you’re doing is solving that problem is completely incorrect in the greater context of the codebase you’re working within, like if you wanted to add a dog to the Mona Lisa, you can’t just draw a detailed line art dog or a cartoon dog and expect it to work – you’d need to find someone who can paint a dog similar to the art style of the piece and properly get it to mesh with the painting.

      • Solar Bear@slrpnk.net · +12/-5 · edited · 1 year ago

        Never ask ChatGPT to write code that you plan to actually use, and never take it as a source of truth. I use it to put me on a possible right path when I’m totally lost and lack the vocabulary to accurately describe what I need. Sometimes I’ll ask it for an example of how something works so that I can learn it myself. It’s an incredibly useful tool, but you’re out of your damn mind if you’re just regularly copying code it spits out. You need to error-check everything it does, and if you don’t know the syntax well enough to write it yourself, how the hell do you plan to reliably error-check it?

        • _danny@lemmy.world · +10 · 1 year ago

          You absolutely can ask it for code you plan to use as long as you treat chatgpt like a beginner dev. Give it a small, very simple, self contained task and test it thoroughly.

          Also, you can write unit tests while being quite unfamiliar with the syntax. For example, you could write a unit test for a function which utilizes a switch statement, without using a switch statement to test it. There’s a whole school of “test-driven development” where this kind of development would probably work pretty well.
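          A minimal sketch of that idea in JavaScript (the function name and status labels here are hypothetical, just for illustration): the function under test uses a switch, but the test drives it from a plain lookup table, so the test author never writes a switch at all.

```javascript
// Hypothetical function under test: maps an HTTP status code
// to a human-readable label using a switch statement.
function httpStatusLabel(code) {
  switch (code) {
    case 200: return "ok";
    case 404: return "not found";
    case 500: return "server error";
    default:  return "unknown";
  }
}

// The test side uses no switch syntax: expected results live in a
// plain object, and we just loop over it and compare.
const expected = { 200: "ok", 404: "not found", 500: "server error", 999: "unknown" };
for (const [code, label] of Object.entries(expected)) {
  if (httpStatusLabel(Number(code)) !== label) {
    throw new Error(`httpStatusLabel(${code}) should be "${label}"`);
  }
}
```

          Either side can be rewritten independently, which is the point: the test pins down the behavior, not the syntax.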

          I’ll agree that if you can’t test a piece of code, you have no business writing in the language in a professional capacity.

      • blackbirdbiryani@lemmy.world · +6/-1 · 1 year ago

        I write a lot of bash and I still have to check syntax every day, but the answer to that is not ChatGPT but a proper linter like ShellCheck, which you can trust because it’s based on a rigid set of rules, not the black box of an LLM.

        I can understand the syntax justification for obscure languages that don’t have a well-written linter, but if anything that gives me less confidence about ChatGPT, because its training material for an obscure language is likely smaller.

        • ByGourou@sh.itjust.works · +2/-1 · 1 year ago

          Less checking syntax and more like “what was this function name again?”
          “Which library has that?”
          “Do I need to instance this or is it static?”
          All of these can be answered by documentation, but who wants to read the docs when you can ask ChatGPT? (Copilot is better in my experience, btw.)

      • state_electrician@discuss.tchncs.de · +2/-1 · 1 year ago

        ChatGPT cannot explain, because it doesn’t understand. It will simply string together a likely sequence of characters. I’ve tried to use it multiple times for programming tasks and found each time that it doesn’t save much time, compared to an IDE. ChatGPT regularly makes up methods or entire libraries. I do like it for creating longer texts that I then manually polish, but any LLM is awful for factual information.

        • chicken@lemmy.dbzer0.com · +1/-1 · 1 year ago

          ChatGPT regularly makes up methods or entire libraries

          I think that when it is doing that, it is normally a sign that what you are asking for does not exist and you are on the wrong track.

          ChatGPT cannot explain, because it doesn’t understand

          I often get good explanations that seem to reflect understanding, which often would be difficult to look up otherwise. For example when I asked about the code generated, {myVariable} , and how it could be a valid function parameter in javascript, it responded that it is the equivalent of {"myVariable":myVariable}, and “When using object literal property value shorthand, if you’re setting a property value to a variable of the same name, you can simply use the variable name.”
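          That particular explanation does check out against how JavaScript actually behaves; a quick sketch (the variable and function names are just for illustration):

```javascript
// Object literal property value shorthand:
// { myVariable } is equivalent to { "myVariable": myVariable }.
const myVariable = 42;
const longhand = { myVariable: myVariable };
const shorthand = { myVariable };
if (shorthand.myVariable !== longhand.myVariable) throw new Error("not equivalent");

// In a parameter position the same braces mean destructuring,
// which is why { myVariable } also shows up as a function parameter.
function describe({ myVariable }) {
  return `myVariable is ${myVariable}`;
}
if (describe(shorthand) !== "myVariable is 42") throw new Error("destructuring failed");
```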

          • state_electrician@discuss.tchncs.de · +3 · 1 year ago

            If ChatGPT gives you correct information you’re either lucky or just didn’t realize it was making shit up. That’s a simple fact. LLMs absolutely have their uses, but facts ain’t one of them.

      • anti-idpol action@programming.dev · +1/-1 · 1 year ago

        you can remain in a mental state of thinking about the actual problem

        More like you’ll end up wasting a significant amount of time debugging not only the problem, but also ChatGPT, trying to correct the bullshit it spews out, which often ignores parts of your prompt.

          • anti-idpol action@programming.dev · +1 · 1 year ago

            It might be wrong even if you provide extensive context to make its heuristics more accurate. And providing extensive context is pretty time-consuming at times.

          • anti-idpol action@programming.dev · +1 · 1 year ago

            I mean it might be good at helping you when you’re stuck, but sometimes it misses simple issues such as typos and for one issue resolved, it might introduce another if you’re not careful.

    • apinanaivot@sopuli.xyz · +12/-7 · 1 year ago

      All ChatGPT is doing is guessing the next word.

      You are saying that as if it’s a small feat. Accurately guessing the next word requires understanding of what the words and sentences mean in a specific context.

      • blackbirdbiryani@lemmy.world · +15 · 1 year ago

        Don’t get me wrong, it’s incredible. But it’s still a variation of the Chinese room thought experiment; it’s not a real intelligence, but it’s really good at pretending to be one. I might trust it more if there were variants based on strictly controlled datasets.

        • jadero@programming.dev · +2 · 1 year ago

          I have read more than is probably healthy about the Chinese room and variants since it was first published. I’ve gone back and forth on several ideas:

          • There is no understanding
          • The person in the room doesn’t understand, but the system does
          • We are all just Chinese rooms without knowing it (where either of the first 2 points might apply)

          Since the advent of ChatGPT, or, more properly, my awareness of it, the confusion has only increased. My current thinking, which is by no means robust, is that humans may be little more than “meatGPT” systems. Admittedly, that is probably a cynical reaction to my sense that a lot of people seem to be running on automatic a lot of the time combined with an awareness that nearly everything new is built on top of or a variation on what came before.

          I don’t use ChatGPT for anything (yet) for the same reasons I don’t depend too heavily on advice from others:

          • I suspect that most people know a whole lot less than they think they do
          • I very likely know little enough myself
          • I definitely don’t know enough to reliably distinguish between someone truly knowledgeable and a bullshitter.

          I’ve not yet seen anything to suggest that ChatGPT is reliably any better than a bullshitter. Which is not nothing, I guess, but is at least a little dangerous.

          • nogrub@lemmy.world · +2 · 1 year ago

            What often puts me off is that people almost never fact-check me when I tell them something, which also tells me they wouldn’t do the same with ChatGPT.

        • worldsayshi@lemmy.world · +1 · edited · 1 year ago

          The Chinese room thought experiment doesn’t prove anything and probably confuses the discussion more than it clarifies.

          In order for the Chinese room to convince an outside observer of knowing Chinese like a person the room as a whole basically needs to be sentient and understand Chinese. The person in the room doesn’t need to understand Chinese. “The room” understands Chinese.

          The confounding part is the book, pen, and paper. It suggests that the room is “dumb”. But to behave like a person, the person-not-knowing-Chinese plus book and paper needs to be able to memorize and reason about very complex concepts. You can do that with pen, paper, and a person who doesn’t understand Chinese; it just takes an awful amount of time and a complex, continuously changing set of rules in said book.

          Edit: Dall-E made a pretty neat mood illustration

        • Fraylor@lemm.ee · +1 · 1 year ago

          So, theoretically, could you train an AI strictly on verified programming textbooks, research, etc.? Would it currently be possible to make an AI that would do far better at programming? I love the concepts around AI but I know fuckall about ML and the actual intricacies of it. So sorry if it’s a dumb question.

          • PixelProf@lemmy.ca · +5 · 1 year ago

            Yeah, this is the approach people are trying to take more now. The problem is generally the amount of data needed and verifying it’s high quality in the first place, but these systems are positive feedback loops, both in training and in use. If you train on higher-quality code, it will write higher-quality code, but it will be less able to handle edge cases or complete code that wasn’t at the same quality bar or style as the training code.

            On the use side, if you provide higher quality code as input when prompting, it is more likely to predict higher quality code because it’s continuing what was written. Using standard approaches, documenting, just generally following good practice with code before sending it to the LLM will majorly improve results.

      • worldsayshi@lemmy.world · +2 · 1 year ago

        Yup. Accurately guessing the next thought (or action) is all brains need to do so I don’t see what the alleged “magic” is supposed to solve.

    • LemmyIsFantastic@lemmy.world · +1/-5 · 1 year ago

      It’s okay old man. There is a middle there where folks understand the task but aren’t familiar with the implementation.

  • EnderMB@lemmy.world · +85 · 1 year ago

    ChatGPT is banned by my employer, because they don’t want trade secrets being leaked, which IMO is fair enough. We work on ML stuff anyway.

    Anyway, we have a junior engineer that has been caught using ChatGPT several times, whether it’s IT flagging its use, seeing a tab open in their browser during a demo, or simply just seeing code they obviously didn’t write in code I’m reviewing.

    I recently tried to help them out on a project that uses React, and it is clear as day that this engineer cannot write code without ChatGPT. The library use is all over the place, they’ll just “invent” certain APIs, or they’ll use things that are deprecated or don’t work if you’ve even attempted to think about the problem. IMO, reliance on ChatGPT is much worse than how juniors used to be reliant on Stack Overflow to find answers to copy-paste.

      • EnderMB@lemmy.world · +25 · edited · 1 year ago

        One of the dirty secrets at FAANG companies is that lots of people join from internships, and can get all the way to senior and above without ever needing to go through a standard, full technical loop. If you have a formal apprenticeship scheme, sometimes you’ll join through a non-tech loop.

    • Nahdahar@lemmy.world · +26 · edited · 1 year ago

      The underlying problem is the same; it just became more accessible to copy code you don’t understand (you don’t even need to come up with a search query that leads you to some kind of answer; ChatGPT will interpret your words and come up with something). Proper use of ChatGPT can boost productivity, but people (both critics of ChatGPT and people who don’t actually know how to code) misuse it, looking at it as a “magic solution box” instead of a tool that can assist development and lead you to solutions.

    • wizardbeard@lemmy.dbzer0.com · +23/-1 · edited · 1 year ago

      I’ve got no issues with people using stackoverflow or chatGPT as a reference. The problem has always been when anyone just skims what they found and just paste it in without understanding it. Without looking at the rest of the comments, further discussion, or looking at any other search results for further insight and context.

      I think chatGPT makes this sort of “carelessness” (as opposed to carefulness) even easier to do, as it appears to be responding with an answer to your exact question and not just something the search algorithm thinks is related.

  • Semi-Hemi-Demigod@kbin.social · +50/-1 · 1 year ago

    In days of yore, before Google or even Altavista, you could tell the quality of a team by how many O’Reilly books they had on the shelves.

    • corsicanguppy@lemmy.ca · +1 · 1 year ago

      I should sell mine. Maybe I’ll keep the crab book and the white book, but the latter’s not even an O’Reilly.

  • szczuroarturo@programming.dev · +40/-1 · 1 year ago

    I find it to be surprisingly useless compared to the classic approach. But in my case it might be because of the language I work with (ABAP).

    • Skyrmir@lemmy.world · +49/-2 · 1 year ago

      It’s not the language. ChatGPT is about as useful as a decent code manual. It won’t actually solve any problems for you, but it can show you the general format for doing so.

        • EatATaco@lemm.ee · +2 · 1 year ago

          Yeah, I’ve used it for boilerplate stuff for things I’ve not done before, but I always then read about what it did and make sure I understand it and where to look further.

          • marcos@lemmy.world · +7/-1 · 1 year ago

            Ok, I’ll use the “usually” wildcard to absorb this one.

            Odds are that ChatGPT can help you better with C# than the documentation. It’s also easier to navigate: you don’t need to already know the answer to ask a good question, whereas even knowing the answer and having a search engine won’t necessarily help you find the right Microsoft doc.

        • moonpiedumplings@programming.dev · +1 · 1 year ago

          Sometimes whatever you are working with will have outdated or really poor docs, so an advanced internet info aggregator is useful in that sense.

          I started learning nix before chatgpt and it was a nightmare. I had to continually ask for help on discord, of all places, for things that should really be in the docs.

          ChatGPT makes nix easier, except not really, because its info is outdated a lot of the time.

  • guywithoutaname@lemm.ee · +36/-1 · 1 year ago

    I strongly advise against that. As others pointed out, it really is just predicting the next word. It is worth learning how to problem-solve, and recognizing that the only way to become a better programmer is with practice. It’s better to get programming advice from real people online and read the documentation for the functions and languages you are trying to use.

    • eerongal@ttrpg.network · +34 · 1 year ago

      Notepad++ is perfectly fine to code in. With the wealth of plugins it has, it’s pretty similar to vscode in how you can trick it out with all sorts of things it can’t do by default.

      • Kogasa@programming.dev · +22/-2 · 1 year ago

        I’m a tolerant person, but come on, man. Between VSCode, JetBrains, (n)vim, and emacs, I can’t think of a legitimate reason to use np++ for development over any of them.

        • MikuNPC@lemm.ee · +11 · 1 year ago

          It’s super fast in comparison to full IDEs and is easier to use than most editors. I switch between VSCode and Notepad++ depending on what I’m doing.

          • DudeDudenson@lemmings.world · +8 · 1 year ago

            Macros, man: being able to record a macro and use it quickly and easily is worth its weight in gold when you’re doing something super repetitive that there are no automatic refactors for.

            And I hate the “modern sleek design” culture of making all the options hidden and difficult to reach. Notepad++’s interface is so fucking clean and useful.

            I still use IntelliJ because of a lot of other things, but quite often I find myself using Notepad++ for specific tasks, and it’s such a treat.

        • GBU_28@lemm.ee · +6 · 1 year ago

          If you accept confusing and unsettling peers who watch you screen share and hammer out keybinds that do mysterious things, then you’re all set.

        • tiredofsametab@kbin.social · +3 · edited · 1 year ago

          I don’t use JetBrains because it’s not free, I mainly use VSCode since it is and works fine, but I would use np++ after that. I spent years working in np++.

          I played with Linux in the early ’90s, but mostly got started on Gentoo Linux years ago, and they had people installing Nano when building from the ground up. I grew to like that and never really learned Vim. I did use Emacs every now and again, but all of those have lots of unwieldy key combinations that require memorization and don’t work like the programs people coming from, for instance, Windows would be at all familiar with. The barrier to entry was too high to bother with, so it was Wine and np++ since I was also still using Windows for work.

          I’ve been forced to use a Mac for work for the last almost-year and still can’t find anything as good as np++. BBEdit is as close as I can get, and I’m still not really a fan.

        • pirrrrrrrr@lemmy.dbzer0.com · +3 · 1 year ago

          The tiny, tiny footprint and speed to load.

          I would think I’d probably use an IDE if I was coding all the time.

          Heck, I’m only using it because JFE got too old.

          I do have VSCode set up, even with the same scheme as NP++… but let’s face it, the most complex things I’m using are PowerShell and Node.js.

        • droans@lemmy.world · +1 · 1 year ago

          I’ll use it for one-off short scripts. No point in doing the whole shebang for something that doesn’t need it.

  • PrettyFlyForAFatGuy@lemmy.ml
    link
    fedilink
    arrow-up
    28
    arrow-down
    2
    ·
    1 year ago

    I can code a feature faster than I can debug ChatGPT’s attempt, so long as it’s in JS.

    ChatGPT is better at bash than me, though.

  • smotherlove@sh.itjust.works
    link
    fedilink
    arrow-up
    31
    arrow-down
    5
    ·
    1 year ago

    I’ve always, always been a documentation-only guy. Meaning I almost never use anything other than the documentation for the languages and libraries I use. I genuinely don’t feel that I’m missing out on anything, I already write code faster than my peers and I don’t feel the need to try to be some sort of 10x developer.

    • alignedchaos@sh.itjust.works
      link
      fedilink
      arrow-up
      18
      arrow-down
      1
      ·
      1 year ago

      Sometimes there are better methods to implement something, and we can learn from others’ mistakes without having to make them ourselves

      • wizardbeard@lemmy.dbzer0.com
        link
        fedilink
        English
        arrow-up
        9
        arrow-down
        3
        ·
        1 year ago

        How would ChatGPT streamline that any better than other documentation and skimming things like programming blogs and Stack Overflow?

        • emptiestplace@lemmy.ml
          link
          fedilink
          arrow-up
          6
          arrow-down
          10
          ·
          1 year ago

          It’s not as dumb as you are suggesting. I’ve been programming in various languages since the 80s and I can say with confidence that your take is, at best, absurd. Go spend some time with GPT 4.

          • smotherlove@sh.itjust.works
            link
            fedilink
            arrow-up
            9
            arrow-down
            2
            ·
            1 year ago

            I’m IP banned due to my VPN. If they don’t want my business, that’s fine. I’m not getting off my VPN just to interact with proprietary software.

    • canni@lemmy.one
      link
      fedilink
      arrow-up
      2
      arrow-down
      2
      ·
      1 year ago

      I’ve always, always been an intuition-only guy. Meaning I almost never use anything other than blind guessing on how languages and libraries work. I genuinely don’t feel I’m missing out on anything, my farts already smell better than my peers’ and I just don’t feel the need to learn the modern tools of my trade.

      • smotherlove@sh.itjust.works
        link
        fedilink
        arrow-up
        24
        arrow-down
        1
        ·
        1 year ago

        Learning the proper way to do things has nothing to do with being smart. I don’t think I’m smarter than my peers, I just have a longer attention span (thanks adderall!)

        • TrickDacy@lemmy.world
          link
          fedilink
          arrow-up
          3
          arrow-down
          2
          ·
          1 year ago

          Yeah, I don’t buy it. I don’t think I’ve ever once used documentation so good that I didn’t need to use a search engine at some point. Kind of odd you pretend you never use one lol… You and all the downvoters seem to have some kind of strange complex where you need to feel superior.

          • smotherlove@sh.itjust.works
            link
            fedilink
            arrow-up
            3
            arrow-down
            1
            ·
            1 year ago

            I don’t get how you could think that’s what I meant. Every person with an internet connection uses a search engine. I use search engines to find relevant documentation. I assumed that went without saying.

            • TrickDacy@lemmy.world
              link
              fedilink
              arrow-up
              3
              arrow-down
              1
              ·
              1 year ago

              You say you use documentation only, but you obviously use Stack Overflow or other forums too, like the rest of us. But keep on pretending.

              • smotherlove@sh.itjust.works
                link
                fedilink
                arrow-up
                1
                ·
                1 year ago

                If you don’t believe me, there isn’t really anything I can do to change your mind. It doesn’t matter to me either way, but no, I don’t go on Stack Overflow, because it’s toxic and unhelpful. I go on forums or IRC if I want to discuss the development of a library.

                Honestly, I am pretty used to the “you can’t do things your way, you have to do them our way or it won’t work” shit, because practically every neurotypical thinks this way. All I’m saying is there’s a reason I’m faster than my peers. I don’t know how much of that is due to my avoidance of “crutches”, but I’m certain it’s nonzero.

        • TrickDacy@lemmy.world
          link
          fedilink
          arrow-up
          2
          arrow-down
          1
          ·
          1 year ago

          The fact that you people pretend to only use documentation, like some elitist Boy Scouts, actually does say something about you.

          I don’t believe you; you’re lying; you just want to seem smart. I don’t give a flying fuck if random internet people think I’m smart or whatever the hell else you’re suggesting. I just flat don’t care. I know there is nothing wrong with using search engines and Stack Overflow, and that we all do it. Pretty weird that you all pretend otherwise. Kind of sad, really, that your ego requires this of you.

  • Mr_Lobster@lemm.ee
    link
    fedilink
    English
    arrow-up
    23
    arrow-down
    1
    ·
    1 year ago

    I literally cannot comprehend coding with ChatGPT. How can I expect something to work if I don’t understand it, and how can I understand it if I don’t code and debug it myself? How can you expect to troubleshoot any issues afterwards if you don’t understand the code? I wouldn’t trust GPT for anything more complex than Hello World.

    • mild_deviation@programming.dev
      link
      fedilink
      arrow-up
      13
      arrow-down
      1
      ·
      1 year ago

      Just yesterday, I wrote a first version of a fairly complex method, then pasted it into GPT-4. It explained my code to me clearly, I was able to have a conversation with it about the code, and when I asked it to write a better version, that version ended up having a couple significant logical simplifications. (And a silly defect that I corrected it on.)

      The damn thing hallucinates sometimes (especially with more obscure/deep topics) and occasionally makes stupid mistakes, so it keeps you on your toes a bit, but it is nevertheless a very valuable tool.

      • philm@programming.dev
        link
        fedilink
        arrow-up
        8
        ·
        1 year ago

        That only really works if the method is self-contained and written in a language that GPT has seen often (such as Python). I stopped using it, because for every 1 successful try I wasted time on the other 9…

    • worldsayshi@lemmy.world
      link
      fedilink
      arrow-up
      12
      arrow-down
      1
      ·
      edit-2
      1 year ago

      You shouldn’t use code that you don’t understand. ChatGPT outputs quite readable and understandable code, makes sure to explain a lot of it, and you can ask questions about it.

      It can save quite a lot of effort, especially for tasks that are more tedious than hard. Even more if you have a general idea of what you want to do but you’re not familiar with the specific tools and libraries that you want to use for the task.

      • III@lemmy.world
        link
        fedilink
        English
        arrow-up
        11
        arrow-down
        1
        ·
        1 year ago

        It’s also wrong a lot. Hence the requirement for understanding. It can be helpful to get through a stretch but it will fuck up before too long and relying on it entirely is a bad idea.

    • philm@programming.dev
      link
      fedilink
      arrow-up
      4
      ·
      1 year ago

      This.

      If I’m writing something slightly more complex, ChatGPT(4) is mostly failing.

      If I’m writing complex code, I don’t even get the idea of using ChatGPT, because I’m only getting disappointed, and in the end waste more time trying to “engineer” the prompt, only to get disappointed again.

      I currently cannot imagine using ChatGPT for coding, I was excited in the beginning, and it’s sometimes useful, but mostly not really for coding…

      • worldsayshi@lemmy.world
        link
        fedilink
        arrow-up
        3
        ·
        1 year ago

        If you’re already knee-deep in existing code, looking for bugs, or need to write quite specific algorithms, it doesn’t seem very useful. But if you for some reason need to write anything that has the slightest feeling of boilerplate, like “how do I interact with well-established framework or service X while doing A, B, C”, it can be really useful.

        • oldfart@lemm.ee
          link
          fedilink
          arrow-up
          2
          ·
          1 year ago

          Also, it often does a great job if you paste a stack trace into it, and maybe some surrounding code. I used it to fix someone else’s Java code, as well as to upgrade some third-party WordPress junk to the latest PHP. I barely know Java and stopped following PHP news around version 5.6.

    • Psythik@lemm.ee
      link
      fedilink
      arrow-up
      7
      arrow-down
      3
      ·
      1 year ago

      I hadn’t done web development in over 20 years; thanks to ChatGPT, I was able to get up to speed and start building websites again, when in the past I would never have been able to do so.

      GPT is a powerful tool that can allow anyone to do anything if they’re willing to put in the effort. We should be praising it, not making fun of it. It’s as revolutionary as the internet itself.

    • 1984@lemmy.today
      link
      fedilink
      arrow-up
      2
      ·
      edit-2
      1 year ago

      Often the code is self-explanatory. I can usually understand the code, yet I still couldn’t write it correctly from scratch. You never feel like that?

      This is how code examples in books work too. You get some code to look at and try to understand it. Otherwise it’s like ignoring code examples while learning programming.

    • corsicanguppy@lemmy.ca
      link
      fedilink
      arrow-up
      2
      arrow-down
      1
      ·
      1 year ago

      I use it to give me prototypes for Ansible, because Ansible is junk. Then I build my stuff from the mishmash and have GPT check it. Cuts down a lot of time that I’d rather be spending on any-bloody-thing else.

      • namingthingsiseasy@programming.dev
        link
        fedilink
        arrow-up
        5
        ·
        1 year ago

        Basically, yeah. Dennis Ritchie wrote the C compiler because he knew exactly what he wanted to use it for and the kinds of code that he wanted to write. Then he went on to co-write, with Brian Kernighan, the book that everyone used to learn the language.

        This is true of probably every language, library, framework, etc. The original designer writes it because he knows what he wants to do with it and does so. Then everyone else follows. People then add more features and provide demonstrations of how to use them, and others copy them. It is extremely hard to just look at an API and use that to figure out exactly which calls should be made and in what order. Everyone just reads from the examples and adapts them as needed.

  • Devouring@lemmy.world
    link
    fedilink
    arrow-up
    19
    ·
    1 year ago

    If you’re doing something extremely skillfully, ChatGPT will make the dumbest suggestions ever…

    ChatGPT is good for learning ideas and new things, as an aggregate of what everyone thinks about them. But as a coding tool it cannot reason properly and has rubber-stamp solutions for everything.

    • DudeDudenson@lemmings.world
      link
      fedilink
      arrow-up
      10
      ·
      1 year ago

      Well, yes, its responses are based on what the average of the internet would say.

      I’m surprised it doesn’t constantly tell you to format Windows and reinstall, no matter what you ask.