
Posted (edited)

Loads of people are going to be very, very, very triggered by this topic, but it's exactly those people's points of view that I'm trying to understand better.

Let me first set the scene:

You are browsing the ModDB and you see a cool new mod. You open it up, but of course it's 2026, so you're immediately on the alert for whether or not it involves AI.

Edit: This is not about AI art. There has already been a discussion about that, and it went nowhere. So kindly go read that forum post if you want to say something about AI art in mods.

Question:

What constitutes "AI use" in mods, i.e. what level of AI use is acceptable to you?

  1. A simple example: I google "vintage story json patching how to use addmerge". I read the AI summary and apply what I've learned there to my mod. I state as much in an AI Use Disclaimer on the mod page. Is this mod now AI slop?
  2. Another example: I clone the vsessentialsmod, vssurvivalmod and vsapi from github, and then use AI to ask questions about the codebase to understand how it was done in vanilla, then apply that knowledge to my own mod. Is this mod now AI slop?
  3. One more: I don't know C# syntax all that well, so I ask e.g. Cursor "write me a for loop skeleton" and then use that for loop as a starting point for my own code. Is this mod now AI slop?
    The for loop in question, for those who don't know:

    for (int i = 0; i < 5; i++)
    {
        Console.WriteLine(i);
    }
  4. Second to last one: If you answered no to any of the above, then at what point would you feel it is required of the author to use an AI Use Disclaimer in their mod's page?  
  5. Last one: if the author has an AI usage disclaimer on their mod page, do you even bother to read it, or does it immediately disqualify the mod as AI slop, even if the disclaimer says "I used AI to translate Banane (German) to banana (English)"?
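
For anyone who hasn't seen it, a rough sketch of what such an addmerge patch file looks like. The target file, path, and attribute below are made up purely for illustration (and I'd double-check the exact file path format on the official wiki before copying this):

```json
[
  {
    "op": "addmerge",
    "file": "game:itemtypes/tool/axe",
    "path": "/attributes",
    "value": { "myMadeUpAttribute": true }
  }
]
```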

Edited by Grym7er
Added clarification
  • Like 1
Posted (edited)

Not quite your intended target audience, but I'll start anyway:
1. No
2. No
3. No
4. As soon as people start treating AI not as a helpful tool, but as a replacement for their own effort (or, to an extent, as a replacement for other people's effort, but that would again be more about AI art). In your examples you always emphasized that AI was used as a single step in a longer process, at the end of which you still do the work yourself. That's perfectly fine, and the only proper use AI should ever have. The moment people start "vibe coding" and arrive at a result they have no f-ing clue about, because they didn't know how to code before and now have AI-generated code they couldn't understand even if they wanted to, that is AI slop and definitely needs a disclaimer. After all, a real developer can understand and fix the mistakes the AI will inevitably make (speaking from lots of experience). Somebody who doesn't know the first thing about coding is at the mercy of the big random number generator that an AI is, hoping it spews out code solid enough not to break apart immediately, like monkeys on typewriters. They will never be able to find problems until the problems show themselves, and will never be able to do anything about them beyond going back to said AI and asking "It's not working, can you fix it please?"
5. Yes. People being honest about this is so rare that properly written AI disclaimers are a rare sight, so I am intrigued whenever they exist.

If anyone wants a way more in-depth rant about why this is my opinion on the topic, let me know. But knowing the frailty of most people's sanity when it comes to the dreaded, controversial topic of "AI use", I'd rather save myself the time and energy otherwise.

Edited by Rainbow Fresh
  • Like 5
Posted
18 minutes ago, Rainbow Fresh said:

The moment people start "vibe coding" and arrive at a result they have no f-ing clue about, because they didn't know how to code before and now have AI-generated code they couldn't understand even if they wanted to, that is AI slop and definitely needs a disclaimer.

I pretty much agree with you on all counts, except that I think this line has a bit more nuance to it. With the way AI is going, you do not need to be able to understand code to fix bugs, up to a certain (and presently, fairly limited) point. Tools like Cursor do the actual debugging for you: if you can clearly and concisely explain what the bug is, when it happens, how to reproduce it, and sometimes even (depending on how well you understand game design) what you think the issue could be, then in many cases the AI can find and solve the bug. With these simple systems, debugging is often a matter of understanding the problem, not necessarily understanding how to fix it. But then again, that kind of understanding generally develops from understanding how these systems work at a lower level, so it's a bit of a catch-22.

So I think vibe coding a simple mod is totally fine, achievable, and maintainable (if you can foot the bill for vibe coding), but beyond that, you should rather learn how to do it yourself.

For myself, I vibe coded a mod (but I do programming in my job), which kinda got me hooked, so now I'm learning it the right way : )

Thanks for the response.

  • Like 1
Posted
1 minute ago, Grym7er said:

Tools like Cursor do the actual debugging for you: if you can clearly and concisely explain what the bug is, when it happens, how to reproduce it, and sometimes even (depending on how well you understand game design) what you think the issue could be

Or in other words, "The AI can fix the bug for you, if you do the work of the AI." Which is exactly the point I meant. We can argue long and hard about whether putting your time and effort into properly telling the AI to do the work is any better than just doing the work yourself, but at the end of the day my core point still stands: as long as you do work that is aided by AI, it is most likely not "AI slop".

In your example, you still need a basic understanding of things: of what you are doing, what the AI has done, how AI prompting works in general, and what the common error vectors are. Instead of specifically knowing C# syntax and all 1,000,000 functions from the 5,000 libraries out there, you let the AI do the actual "putting it into words". You still need to understand what you are doing well enough to "teach the AI", and understand the result well enough to tell "what's wrong". That is OK. Debatable on an individual level, but generally OK.

If you use AI to do things you are utterly unqualified for in the first place, then it becomes not OK. If all you can do is "OK ChatGPT, I want a Vintage Story mod that lets me spawn mobs" and then go "No, that didn't work, here is the log file, no clue what's wrong" 100 times until it does work, that's AI slop. You have no ownership of the result. You have no expertise on the matter. Whatever code the AI generated, you couldn't maintain it to save your life, so as soon as your AI subscription ends, it's dead in the water. That is the AI's work, not yours, and putting it out as yours is "AI slop" and worthy of criticism.

  • Like 7
Posted

I feel sort of the same way as Rainbow Fresh.

1. I have mixed feelings about it.

2. No.

3. Also mixed feelings about this one, but leaning towards no.

I'm generally distrustful of the Google AI summary, but I might let it slide if it's researched further and not just followed blindly.

4. I would say as soon as any AI is involved, or at least as soon as it's used for anything somewhat important to the mod, like models, code, and music/sounds.

As for "vibe coding" and the like, that's a hard no for me.

First of all, if you don't know how YOUR code works, that's a big problem WHEN (not if) something breaks. Second of all, I'm not going to risk a save I have put tens or even hundreds of hours into on a mod someone spent seven hours total to "make", with a prompt moderately rephrased 20 times until the program could open without fatal errors.

I think it's kind of acceptable to use AI in small amounts IF YOU ARE A BEGINNER (IMPORTANT) when making a mod, but personally I would prefer 100% human-made code much, MUCH more than some concoction of scraped GitHub code repackaged by Claude/ChatGPT/Copilot.

5. I would likely skip the mod, except if AI was only used to translate things; that's fine.

tl;dr: I would hugely prefer completely human-made mods.

  • Like 5
Posted (edited)

If I made whole mods using AI, I don't have a moral obligation to tell anyone. Especially if it's free to download.

Can anyone honestly tell me why a mod made by a real person is any better than a mod made by AI? And give me a reason beyond "AI BAD"?

I've used AI to make artwork for a few mods I've put on NexusMods. I'm not an artist, and no one is paying me to make said mods. I'm definitely not going to pay an artist to make artwork for something I don't even get paid for.

Sorry I didn't answer your questions one by one, but this can be a blanket answer in the form of my own question:

As long as the mod works the way it should, then what does it matter how it's made?

Edited by KahvozeinsFang
  • Wolf Bait 2
Posted
14 minutes ago, KahvozeinsFang said:

If I made whole mods using AI, I don't have a moral obligation to tell anyone. Especially if it's free to download.

You may not feel morally obligated, but more and more places in the world certainly legally require you to.

  • Like 2
Posted (edited)
20 minutes ago, KahvozeinsFang said:

Can anyone honestly tell me why a mod made by a real person is any better than a mod made by AI? And give me a reason beyond "AI BAD"?

Yes. As someone who has published one (1) AI-developed mod, I like to explore AI tools to keep an eye on them, but I usually come away with nearly the same understanding every time.

The problem with AI mods is what you pave over in this line:

20 minutes ago, KahvozeinsFang said:

As long as the mod works the way it should, then what does it matter how it's made?

The problem is that it's far more likely that an AI-authored mod doesn't work the way it should. AIs frequently hallucinate and non-deterministically just wet the bed. The one AI-developed mod I've published was a lottery win. For chunks of my daily programming work, I have to explicitly disable even AI auto-complete in my IDEs because they don't understand my style, my language, my situation, or my environment, and constantly make basic mistakes. The pull of the training data is too strong in large swaths of niches outside the mainstream. Even safely inside the mainstream, we're still getting headlines about AI agents wiping production databases. Mods are low-stakes, but it still sucks to deal with poorly written software that was churned out carelessly and never reviewed by a human.

There is, theoretically, a "right" way to do AI-assisted development: gluing it to strict guardrails, giving careful, explicit instruction, and reviewing all output rather than smashing in a prompt and vibe-coding it. But without any sort of disclaimer, it's impossible to know. So it's uncool to waste people's time, and potentially screw them over through your own carelessness, by not at least disclosing it. And without an explicit disclosure, AI thumbnails become the next best hint at this possible undisclosed carelessness and the threat of wasted time.

Besides all that, there are plenty of things to gripe about in the social ramifications of AI (even just specifically in regard to programming) that aren't just "AI bad", but I'm going to cap myself at two paragraphs.

Edited by Diff
  • Like 5
Posted (edited)
1 hour ago, Diff said:

The problem is that it's far more likely that an AI-authored mod doesn't work the way it should.

You haven't answered the question. What you've done is flip the question and turn it into a debate about non-working mods. I'm not interested in non-working mods, because we all know what we do with mods that don't work: we uninstall and we move on. Same thing I would do for any mod that didn't work.

It's like: "oh, my car doesn't work properly, therefore cars are bad".

What I'm interested in hearing about are PROPERLY working mods, and the arguments some have on why they shouldn't be used.

All I've ever heard is the "AI BAD" argument. So please write as many paragraphs as needed to explain it to me. I'd love to be enlightened.

1 hour ago, Rainbow Fresh said:

You may not feel morally obligated, but more and more places in the world certainly legally require you to.

Yes, maybe in China, where they have to disclose any use of AI. But we aren't talking about "any" use. It's mods specifically.

For most regions, you are not obligated to disclose the use of AI if it is only used for "efficiency" or "background" tasks like writing code. And most mods are purely code.

Edited by KahvozeinsFang
Posted (edited)
  1. No, reading an AI summary and applying what you learn is no different from reading Stack Overflow or a wiki page. The AI was just a search and summary tool. It's often wrong anyway.
  2. No, using AI to navigate a codebase is no different from asking a senior developer "how does this work" or "what does this do". You still have to understand it yourself and apply that knowledge yourself. The work is yours.
  3. No, there are only 2-3 ways to do for loops. The syntax is pretty rigid. AI didn't create anything, it just replicated the standard.
  4. Disclosure isn't required because, functionally, it doesn't change anything. If the mod is a soulless blob of code, you'll know the moment you look at it. If it isn't, then what does it matter how it was made? I wouldn't worry about it until Anego Studios says you have to.
  5. I don't judge mods based on how they were made. I judge them based on whether the dev took the time to actually create something. The tool the dev used is inconsequential. Writing the IL code by hand to replicate a for loop doesn't make the mod better. It just means you spent a needless amount of time doing something I could have done by typing "for" in Visual Studio and mashing the Tab key twice to complete the auto-suggestion.

On the broader question of AI use in modding, I do understand the drive to push back. AI has made it almost trivial to flood any creative space with hollow output, be it writing, drawing, or coding. That erodes trust in everything around it, including the things that were designed with love and care by a human. That's a legitimate frustration. However, I think there is a tendency to "throw the baby out with the bathwater", so to speak.

I'd like to gently suggest that the frustration is being aimed at the tool rather than the lackluster behavior. The actual problem isn't the use of AI, but how it's used, whether the user was actually engaged with the tool or just blindly turning wrenches until something worked. Those are entirely different scenarios and conflating them means you're keeping a gate that blocks people with a real vision and minimal know-how from making that vision a reality and creating something with potential.

Photography went through this. There was a time when darkroom skill was the measure of a serious photographer: the hours spent developing the film, blindly dodging and burning the image onto the photo paper, and then developing it only to find that something went wrong and the whole process had to be redone. Then digital happened, and then Photoshop, and then Lightroom, and then phone cameras that put all the photo settings at the user's fingertips. At each step, people called it the death of "real" photography. At each step, the tools and the medium produced work, real work, worth looking at, made by people who cared about the craft. The tools changed. The craft adapted. Darkroom photography is still a thing, and some people still prefer it, by the way. And there are darkroom photographers who produce junk that either collects dust or ends up in the recycling bins.

But the train didn't stop for photography, and it's not stopping for coding either. The modders who figure out how to use the AI tools and use them well, not to replace their better judgement but to extend their capabilities, to spend less time fighting the syntax of a for loop and more time getting their ideas into code form... these are the modders who are going to keep shipping great work. And you can see it in action if you only know where to look.

For what it's worth, my day job has me writing code by hand. The tools they want us to use are old and more often than not I'm having to correct mistakes that could easily have been fixed if someone had taken the time to just outline what the code did and fix the logical errors. An AI tool could do this in minutes. And the less time I spend fighting bugged code, the more time I can spend getting my actual ideas out into working form. At the end of the day, I'm tired, my brain is tired, and all I want to do is play VS, Minecraft, or some other fun game while my brain recovers in the background. That usually leaves me with a couple of hours at the end of the day to try to write my own ideas into code form. If an AI tool can make that happen faster, then... I'm going to use it. Not because I'm lazy. Not because my ideas are uninspired, but because the alternative is not making the mod at all.

I wouldn't want that.

My co-author wouldn't want that.

The users of my mods wouldn't want that.

90% of it is hand-coded anyway. All the AI did was dig through the VS codebase to find method, field, and property names, show me how to connect to them, and point out goofs, glitches, or gotchas in my code. The "worst" case scenario is I ask it, "Is there a better/cleaner way to do xyz?" and it gives me a few options that I choose from and try to implement. There is no other way I could have figured out that the __state of a Harmony prefix/postfix pair can be any object, even one custom-defined in the code to pass any amount of data between the prefix and postfix, thereby eliminating the need for my code to rely on sensitive transpilers that break every time the game updates anyway.
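
For anyone curious what that trick looks like, here's a rough sketch of the __state pattern. The patched class and all the field names are made up for illustration (this is not code from my actual mod); only the Harmony attributes and the __state injection itself are the real mechanism:

```csharp
using HarmonyLib;

// Made-up stand-in for a game class we want to patch.
public class SomeGameClass
{
    public int SomeValue;
    public void DoWork() { SomeValue++; }
}

// __state can be any custom object, not just a primitive.
public class MyPatchState
{
    public int ValueBefore;
    public string Note;
}

[HarmonyPatch(typeof(SomeGameClass), nameof(SomeGameClass.DoWork))]
public static class DoWorkPatch
{
    // Harmony passes whatever the prefix writes into __state...
    [HarmonyPrefix]
    public static void Prefix(SomeGameClass __instance, out MyPatchState __state)
    {
        __state = new MyPatchState
        {
            ValueBefore = __instance.SomeValue,
            Note = "captured before the original method ran"
        };
    }

    // ...into the postfix parameter with the same __state name and type.
    [HarmonyPostfix]
    public static void Postfix(SomeGameClass __instance, MyPatchState __state)
    {
        if (__instance.SomeValue != __state.ValueBefore)
        {
            // React to the change here, no transpiler needed.
        }
    }
}
```

Harmony matches the prefix and postfix by the special __state parameter name, so anything the prefix captures before the original method runs is available to the postfix afterwards.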

TL;DR: AI is a tool like anything else. Judge the work, not the workflow.

Edited by Teh Pizza Lady
just applying some non-AI polish because I type faster than I think sometimes.
  • Like 4
Posted

What constitutes acceptable use of AI in mod development is going to vary based on the individual. Some might be okay with heavy AI use in code but not for artistic assets, some might be okay with using AI to generate artistic assets but not code, some might be okay with both and some might be okay with neither, etc. 

9 hours ago, Grym7er said:

A simple example: I google "vintage story json patching how to use addmerge". I read the AI summary and apply what I've learned there to my mod. I state as much in an AI Use Disclaimer on the mod page. Is this mod now AI slop?

Depends on who you ask. I would say no, and also say this is a case that doesn't really need a disclaimer either since it's not an AI writing the code or generating other assets. In this case, you're the one putting in the effort to look up information and then writing and testing the code to make sure it works. That's the important part, since putting in that effort also means you're more likely to catch errors, or at least fix them when bugs do show up.

9 hours ago, Grym7er said:

Another example: I clone the vsessentialsmod, vssurvivalmod and vsapi from github, and then use AI to ask questions about the codebase to understand how it was done in vanilla, then apply that knowledge to my own mod. Is this mod now AI slop?

Again, depends on who you ask. I wouldn't say this is slop, for similar reasons as above. You're putting in the work yourself to understand the code and create the stuff yourself; the AI is simply a tool that's helping you look up the information needed.

9 hours ago, Grym7er said:

One more: I don't know C# syntax all that well, so I ask e.g. Cursor "write me a for loop skeleton" and then use that for loop as a starting point for my own code. Is this mod now AI slop?

Once again, it depends on who you ask. In my opinion, no, this is still not AI slop and still doesn't require a disclaimer about AI use, but it is at the point where you may want to consider adding a disclaimer about AI and how you used it. The reason I say it's fine is that while the AI might have built a starting point, you're still doing the actual work yourself and making sure the end result is quality. The AI is just a tool you're using to gather information and organize it into an easier starting point.

The reason I say you might want to consider adding a disclaimer about the AI and how it was used at this point though, is that there are people out there who really don't want to use stuff that AI has directly helped generate assets for, even if those assets were heavily edited by a human. So labeling, even if perhaps not technically necessary, helps them to make an informed choice about the mod they're about to use.

9 hours ago, Grym7er said:
  • Second to last one: If you answered no to any of the above, then at what point would you feel it is required of the author to use an AI Use Disclaimer in their mod's page?  
  • Last one: if the author has an AI usage disclaimer on their mod page, do you even bother to read it, or does it immediately disqualify the mod as AI slop, even if the disclaimer says "I used AI to translate Banane (German) to banana (English)"?

I'm just gonna try to answer both of these at the same time, because they go hand in hand. If an AI has been used to generate some code/assets as a starting point, and the results were modified by a human to ensure quality, I think it's a good idea to be open about how AI was used so that people can make informed choices about the product, but it's not absolutely necessary either. If AI was used to generate most of the code and/or other assets, with little editing, then it's no longer just being used as a starting point and needs to be labeled, as at that point it's the AI doing most, if not all, of the work.

I think the main problem is that a lot of people offload all the mental work onto the AI as well as the physical, accepting whatever the AI spits out as "good enough" without bothering to actually check the code to make sure it works, or edit images/sounds to ensure there are no mistakes, etc. Some will also use AI as a convenient way to cut out the human element and keep costs low, in an effort to make more money. No one likes low-quality products, especially not when they seem to be the result of cutting corners to chase fame and money, or a lack of care by whoever made them.

For me personally, I'm not a fan of AI use, so I'll generally ignore stuff that's clearly labeled as AI or otherwise appears to be AI. That being said, it also depends on how the AI was used, which is where clear labeling comes in handy. AI content is still a hard sell, but if it looks like the creator took the time to edit the assets and check code to make sure that the final product is as good as it can be, then I'm more likely to use the product since the creator clearly cared about the product and did their best to ensure quality.

1 minute ago, Teh Pizza Lady said:

I'd like to gently suggest that the frustration is being aimed at the tool rather than the lackluster behavior. The actual problem isn't the use of AI, but how it's used, whether the user was actually engaged with the tool or just blindly turning wrenches until something worked. Those are entirely different scenarios and conflating them means you're keeping a gate that blocks people with a real vision and minimal know-how from making that vision a reality and creating something with potential.

THIS. Hot take incoming, but at this point I'm also just as inclined to ignore stuff that advertises itself as "well at least it's not AI!", since in many cases the product is low quality and I'm really not a fan of things that appear to be labors of spite rather than love. Just because I don't want AI products doesn't mean I stopped caring about the quality of said product, and if the best advertising a product has is "well at least it's not ____", that immediately tells me the product has no real merits of its own to stand on.

Basically, whatever tools someone is using to make a product, make sure that the tools are used appropriately and that the product is as good as it can possibly be. There's always going to be haters, but if it's clear that someone took the time and effort to produce quality, then there shouldn't really be any issues.

  • Like 3
  • Thanks 1
Posted (edited)
1 hour ago, KahvozeinsFang said:

You haven't answered the question. What you've done is flip the question and turn it into a debate about non-working mods. I'm not interested in non-working mods, because we all know what we do with mods that don't work: we uninstall and we move on. Same thing I would do for any mod that didn't work.

It's like: "oh, my car doesn't work properly, therefore cars are bad".

What I'm interested in hearing about are PROPERLY working mods, and the arguments some have on why they shouldn't be used.

All I've ever heard is the "AI BAD" argument. So please write as many paragraphs as needed to explain it to me. I'd love to be enlightened.

It's not flipping the question, it's addressing the elephant in the room. We can't just say "ignore the elephant for the moment" if you actually want to understand why people are acting weird. That is why it needs to be disclosed, that is why they are "worse." Of course, if a human and an AI write the same line of code, there is little difference (mostly, see below).

If you develop a car that has a 1% chance of being assembled improperly and self-destructing after 3 months, you have a bad car. Cars with higher failure rates get the reputation even if a specific instance is fully working. If you could screen for an improperly manufactured car, maybe we can work with that. But we can't. Mods are often closed source and screening code for correctness is intensive. So instead, it's better and wiser to avoid the models of car with that flaw in their production line.

We don't have an oracle that can identify "working" vs "non-working," so it's unreasonable to ignore it.

The other half of it is more philosophical and social. I have two main issues with code generation AI models.

1. Code generation models are trained on vast swaths of open source (and even closed source) code that was released under certain licenses. These models can reproduce those swaths of code readily. That's not surprising; so can every other flavor of model. But by doing it here, big corporations are being given the ability to launder open source code into closed source code against the wishes of those who made it. A swarm of agents can rewrite software in a new language under a new license, letting people take advantage of the hard work of others without abiding by the license they accepted that code under. I don't think it benefits anyone to gift corporations the ability to embrace, extend, extinguish in bold new ways.

2. Use of AI is seductively self-destructive. It can be used well. It can be used as an effective learning tool. It can be a nice rubber duck to debug against. But using it without exercising your own skill or by letting it shortcut your own struggle is already showing signs of diminishing the skills backing it. If you're using it to make mods as a fresh newbie, if not used properly as a learning tool, it can sabotage your own education. I'm not saying this in a "back in my day we had to walk to school uphill both ways" kind of way. Even for using AI, a clear view and a clear mental model of the program is important, and AI itself can interfere with both of those.

Edited by Diff
  • Like 4
Posted
7 minutes ago, Diff said:

If you develop a car that has a 1% chance of being assembled improperly and self-destructing after 3 months, you have a bad car. Cars with higher failure rates get the reputation even if a specific instance is fully working. If you could screen for an improperly manufactured car, maybe we can work with that. But we can't. Mods are often closed source and screening code for correctness is intensive. So instead, it's better and wiser to avoid the models of car with that flaw in their production line.

One thing I want to note on this example: mistakes can happen in either case, but humans can be held accountable for mistakes or otherwise negligent behavior. Can't really do that with computers, at least, not as easily. A computer will do exactly what it was told to do, because it can't actually think about consequences. A human is capable of considering the implications of instructions and choosing whether or not to act on them, and how.

  • Like 1
Posted
12 minutes ago, Diff said:

2. Use of AI is seductively self-destructive. It can be used well. It can be used as an effective learning tool. It can be a nice rubber duck to debug against. But using it without exercising your own skill or by letting it shortcut your own struggle is already showing signs of diminishing the skills backing it. If you're using it to make mods as a fresh newbie, if not used properly as a learning tool, it can sabotage your own education. I'm not saying this in a "back in my day we had to walk to school uphill both ways" kind of way. Even for using AI, a clear view and a clear mental model of the program is important, and AI itself can interfere with both of those.

AI is a crutch, but a crutch isn't the problem. Refusing to heal is. The risk isn't that AI makes things easier. It's that easier makes it tempting to stop struggling, and the struggle is where the actual learning happens. AI will always be faster than an actual developer. Understanding what it's doing, what you're asking it to do, and why, is key to understanding how to use it. When it inevitably goes away, what will we be left with? Developers who don't know how to code. I'm seeing it happen already.

The danger lives in the developer's habits, not the technology. AI doesn't prevent you from learning. It just makes avoidance cheaper. That's a discipline problem, not an indictment of the tool.

  • Like 2
Posted
3 hours ago, KahvozeinsFang said:

As long as the mod works the way it should, then what does it matter how it's made?

Well look at it this way: remove "mod" and replace it with other products, like food, cars, clothing, etc. How something was made can make a big difference in whether or not someone wants to use the product, even if the product itself is objectively good.  You don't have to agree with someone's reasoning on why they choose to use or not use a product, but it's important to make sure that potential customers have enough information to make an informed decision about their choice.

To use food as an example, the food might be very tasty and didn't make me sick after eating it, but if it was made in a dirty kitchen or used a lot of filler ingredients/sourced things from questionable places, or the business doesn't treat their staff in a way I think is fair, then I'm not going to be inclined to give them my money. Though I think it's also fair to point out that if questionable practices were used in the creation of the product, the product's quality most likely isn't going to be very good either, since quality products require quality methods to produce.

Posted (edited)
42 minutes ago, Teh Pizza Lady said:

AI is a crutch, but a crutch isn't the problem. Refusing to heal is. The risk isn't that AI makes things easier. It's that easier makes it tempting to stop struggling, and the struggle is where the actual learning happens. AI will always be faster than an actual developer. Understanding what it's doing, what you're asking it to do, and why, is key to understanding how to use it. When it inevitably goes away, what will we be left with? Developers who don't know how to code. I'm seeing it happen already.

The danger lives in the developer's habits, not the technology. AI doesn't prevent you from learning. It just makes avoidance cheaper. That's a discipline problem, not an indictment of the tool.

This is a lot better phrased than what I wrote, and is largely what I was trying to say. The only thing I'd push back on is the idea that it's always a discipline problem. Sometimes you just don't know better. Sometimes AI is being used to skip work; I saw that all the time when I was a teacher. But even as a learning tool, it's hazardous. AI sounds confident. It's easy to be taken in.

Most recent job I had was at a print shop. The boss had no prior print shop experience and had taught himself a lot from AI and YouTube. But he also trusted its answers, and its confidence in its own abilities, way too much. Some of it was small stuff, like thinking "welding" and "hemming" were the same thing. Other things were actively destructive, like nearly buying a roll of material for hundreds of dollars that would have been incompatible with our equipment, because he didn't know which questions he needed to ask. Or getting ready to print a graphic that the AI said was 300 dpi at 84" across when it was actually 21 dpi, because he didn't know the AI can't verify that.

If you know how AI works, it's easy to label all of this as silly user error, easily avoidable if you understand the tool, but the average person doesn't know how LLMs operate or what they're good or bad at. People can think they're learning when they're actually just being taken for a ride with nobody at the wheel.
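The DPI failure above is plain arithmetic, which is what makes it so easy to check by hand: effective resolution is just pixel count divided by print width, regardless of what the file metadata, or an AI, claims. A minimal sketch (the pixel count here is a hypothetical number chosen to match the anecdote, not a figure from the original post):

```csharp
using System;

class DpiCheck
{
    static void Main()
    {
        // Hypothetical example: a graphic 1764 px wide, to be printed 84 inches across.
        // Metadata (or an AI) can claim any DPI it likes, but the effective
        // resolution is fixed by the pixel count and the physical print width.
        int pixelWidth = 1764;
        double printWidthInches = 84.0;

        double effectiveDpi = pixelWidth / printWidthInches;
        Console.WriteLine($"Effective resolution: {effectiveDpi} dpi"); // 21 dpi, not 300
    }
}
```

A 300 dpi print at 84" would need 25,200 pixels of width; no prompt can conjure detail that was never captured.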

Edited by Diff
  • Like 3
Posted (edited)

My only real beef about "AI" coding is that the "AI" adds its own code to its library. All it takes is one crappy human coder (and there are a lot out there) writing poor spaghetti code for the "AI" to learn to code like that. Assembly spaghetti has its uses, sure, but I can't think of any in a higher-level language. At the moment, well over half of coding could be done by AI, and probably all of it that is outsourced or H-1B'd.

Creative, shmeative. Human coding will soon be niche. Fighting it is like buggy whip manufacturers fighting the internal combustion engine.

[EDIT]

For the time being, the niche for humans will be cleaning up "AI" code. But we are not too far away from having the best of the human coders cleaning up the "AI"'s codebase.

"AI" still has trouble with recursion, but then again, so do most humans. I think there will also be a niche for the really gifted coders to develop new algorithms and methods. If you are a Knuth, "AI" is not a threat.

Edited by Thorfinn
  • Like 1
Posted
2 minutes ago, Thorfinn said:

My only real beef about "AI" coding is that the "AI" adds its own code to its library. All it takes is one crappy human coder (and there are a lot out there) writing poor spaghetti code for the "AI" to learn to code like that. Assembly spaghetti has its uses, sure, but I can't think of any in a higher-level language. At the moment, well over half of coding could be done by AI, and probably all of it that is outsourced or H-1B'd.

Creative, shmeative. Human coding will soon be niche. Fighting it is like buggy whip manufacturers fighting the internal combustion engine.

"It's inevitable" only according to the people who desperately need that to be true to get their bonuses at the end of the next quarter. I see no reason to trust them when I look out at what AI code hath wrought. Even after all the hype about what agentic coding can do, look at Cursor's web browser and Anthropic's C compiler. These are flagship research projects completed by the people most familiar with their own tech, and they're staggeringly lackluster. Some smaller projects have had better luck, like Ladybird's LibJS rewrite, but especially combined with what seem to be diminishing returns, I don't think there's any guarantee, or even likelihood, that we're looking at the end of horses or horse-drawn carriages. Especially when this new car relies on ground-up horses to fuel itself.

  • Like 5
Posted
5 minutes ago, Thorfinn said:

Maybe. But the Model T was nothing compared to a modern car, or even a '56 Chevy or '32 Ford Coupe. Innovation will happen, assuming we don't all die in a nuclear blast.

While I do agree that AI annihilation of human society is the next most likely world-ending scenario after nuclear World War Three, the current state of things is not scary at all, at least from an "AI is replacing us all" perspective. Innovation can surely happen, and the many people who have bet their big stacks of cash on green certainly pray it does, but considering what a frail house of cards the AI industry is built on, I still wouldn't expect dystopia to knock on the door tomorrow.

  • Like 2
Posted

I don't fear "AI", in large part because I don't believe it is possible. Definitely not any time soon. Apart from spiritual explanations, we don't have any idea what it is that changes a bunch of neurons from a brain into a mind. That has to be solved first, regardless of what even bright people like Dawkins think.

Mankind has this thing where it is never satisfied with what it has. It always wants more or different. So there will always be demand for stuff humans do, even if "AI" removes all the drudgery. It will just take creative human minds to conceive of things that "AI" cannot do, or cannot do well enough, then fill that need.

There will always be something for people to do to make a living, and various automation schemes will make lifestyles easier to afford. But this is getting a little far from "AI" creating mods.

Posted
1 hour ago, Diff said:

I don't think there's any guarantee, or even likelihood, that we're looking at the end of horses or horse-drawn carriages.

I mean, despite the fact that cars and other motor vehicles are the main method of modern transport, horses and carriages still have their place. The Amish community, obviously, uses them heavily; we have touring carriages at various parks and festivals, traditional cowboys and mounted police, and so on. There's also Mackinac Island, where motor vehicles aren't allowed except for certain emergency services.

So yeah, safe to say that while AI might become a more widespread tool, it's not going to replace actual human involvement.

Posted

Ignoring all the ethical and functional concerns with AI, one of the most common social concerns is that the "author" of an AI mod isn't invested. Perhaps it shouldn't matter, but people absolutely care that you care about your mod as much as they do and AI bypasses that initial investment. They wanna know that you're gonna be around to fix and expand it. They need to believe that you understand it enough to work on it.

Many AI guys just copy and paste code and assume it's fine if it compiles and works. What they don't see is that the AI was trained on joke responses from Stack Exchange. They go the long way around, touch things they shouldn't, and leave crazy gaps. Maybe you're a super pro engineer who uses AI "properly" and would never, but the other 8 billion vibe coders would and do. You're gonna carry their baggage on this one, because people are executing your code on their machines and lack the technical knowledge to vet you.

Any code or asset produced by the AI makes it an AI mod. People get bad ideas and do weird things with or without AI, so I wouldn't say a disclaimer is needed for that. Any AI disclaimer is a hard skip for me. It's actually led to me just avoiding mods in most games, at this point. Too much hassle to dig through the slop.

I'm not concerned about quality. I've just picked a hill to die on. 

Note: I actively tried to use AI to help with my Vintage Story mod, because all the professional developers I know swear by it. I hate C#, and the prospect of not having to relearn it was appealing. It was a massive waste of time, and I ended up dropping it after around 20 hours. Just learning C#, how to mod Vintage Story, Archipelago, and the C# Archipelago integration library was so much faster, easier, and more pleasant than trying to get the AI to produce one usable line of code. Skill issue, sure, but I already learned a language for talking to computers. Translating back into English just to have it poorly translated back into a programming language is so dumb.

Posted
2 hours ago, Crabsoft said:

Many AI guys just copy and paste code and assume it's fine if it compiles and works. What they don't see is that the AI was trained on joke responses from Stack Exchange. They go the long way around, touch things they shouldn't, and leave crazy gaps. Maybe you're a super pro engineer who uses AI "properly" and would never, but the other 8 billion vibe coders would and do. You're gonna carry their baggage on this one, because people are executing your code on their machines and lack the technical knowledge to vet you.

This is why my computer science classes, back when I was getting my degree, went from 50 students the first semester to 20 the second, 15 the next, and so on. By the time I got to my 4th year of university, we were down to 4 of us.

The 30 students who dropped out or changed majors after the 1st semester were copying and pasting their answers. I remember one kid who said he had an uncle who was a top developer at some firm and would help him out. He showed up with a hand-written note and zero clue what he was doing. AI is now enabling these folks to pursue whatever it was that drew them to computer science in the first place.

But make no mistake: if I put joke responses on Stack Exchange, or code that compiles but does nothing on my GitHub, and someone copies it, or an AI picks it up and makes a mess of someone else's program, that baggage is on them, not me. Using code you don't understand has always been risky, whether it came from Stack Exchange, GitHub, or an AI. If a person chooses to borrow code they don't understand, they alone bear the responsibility for the outcome. Nobody forced them to take shortcuts. Nobody forced them to borrow without asking what it did. Nobody forced them to use it without vetting it. Those were their choices and theirs alone to make.

As the old saying goes, "You made your bed, now lie in it."

Posted
1 minute ago, Teh Pizza Lady said:

But make no mistake: if I put joke responses on Stack Exchange, or code that compiles but does nothing on my GitHub, and someone copies it, or an AI picks it up and makes a mess of someone else's program, that baggage is on them, not me.

 

Just to be clear, I wasn't saying that you are responsible for bad practices. I was saying that, for now, it's unavoidable for "good" AI guys to be associated with the reputation of "bad" AI guys. It's not fair, but it's the reality we have to deal with.

  • Like 3
