Teh Pizza Lady Posted Wednesday at 01:00 AM

Just now, Crabsoft said: Just to be clear, I wasn't saying that you are responsible for bad practices. I was saying that it's unavoidable for "good" AI guys to be associated with the reputation of "bad" AI guys, for now. It's not fair but it is necessary to deal with this reality.

Ah I see that now... yeah your wording did NOT give that impression when I first read it.
Tj Pepler - Critcher Posted Wednesday at 01:48 AM

Why oh why does this keep coming up? This post, just like every other one like it before, will be locked by the mods by tomorrow because it's gonna get heated and lead nowhere. Not that your questions aren't valid, it's just that level heads do not prevail in this arena.

Personally, I have made no attempt to hide that I use AI tools in my workflow. I have a little robot I post right on the main image to identify that I use AI. If you don't like AI and you see the little robot, walk away, don't post, don't tell me your opinion, just walk away.

- I plan and design my own mods without AI help
- I scaffold my projects by hand
- I write all the preliminary code as manually as you'd consider VS Code with tab completion manual
- I have build templates I made, and I let the AI run the build cause it's stupid grunt work
- I have the AI run tests for many hours overnight, running and rerunning tests and outputting errors to a log I can audit manually
- I then test and run across many games manually to make sure it runs with different mod loadouts
- When I have a bug I can't identify instantly I always ask the AI for its opinion; this can save sooooo many hours and HAS saved me so many hours
- Then deploy

Anyone who doesn't like that can go play with other mods, no hard feelings. (But I have some cool mods and people seem to like them)
Farmore Posted Wednesday at 03:09 AM

To me it's not about whether or not the mod is "AI slop". For someone who, because of philosophical and ethical concerns (which I won't be getting into because it's long and mostly personal), avoids gen-AI tools entirely and limits/ends use of services that add it retroactively, it's about whether it's something I can live without and what the impact on my life is.

Example: It would be one thing if a big company retroactively input AI into a service you were already using and need for something like work or school, and I'm always searching for more and more alternatives to popular apps for this reason (though admittedly it's a shrinking circle). That sort of thing comes with trade-offs. Are the trade-offs something I can live with? Do I have another job I can go with instead? How do I ensure I pass my class so I get a good future? Are there alternatives? In that sense, if my only choice is to use AI to avoid failing or losing out on a job I really need, then I may have to contend with unwanted AI use. What about social media, which I use to keep up with important news, my communities, and friends? If I got rid of it entirely, would I still have any friends left? What good is lost by exiting the online communities I've solidified myself in and not contributing? These answers are open-ended. It's up to you as the individual. The question is really this: Is it something I can live without?

Do free optional mods for a video game made by randoms on the internet meet the same level of questioning? That's an easy answer. It's just no. It's not a service, it's not something I need for work or school (to put food on the table). It's a mod. It has no impact on my life; it's not requiring me to change my habits, find alternatives, or seriously weigh its impact on my life to conclude whether or not I want to keep it! So the answer is...
I'll just skip this one irrespective of whether or not it's "slop". It's just "I'm not engaging" to all these questions. Having said that, a more direct answer to your questions:

1. Wouldn't say it's "slop" per se. Most likely it pulled directly from a random public source? When I think of "slop" I think of something truly vibe-coded to its core. It's been AI-assisted... at the least? I just don't think this is like... SLOP, you know?
2. Again, not "slop" per the definition of "vibe-coded to its core". But it is still AI use... AI-assisted? "Slop" just carries a certain connotation. I dunno.
3. Still not "slop", but if you're doing this for every other line of code, well, it's gonna turn into slop.
4. Hard to say, because I don't think there's a firm rule on this. If gen-AI is so intrinsic to your mod that if you removed every instance of it, it would literally break and you couldn't fix it yourself, that's AI slop. Everything below that is AI-assisted at least, but I still personally choose not to engage. I mean, really it depends, but it's teetering on the line. I'm not sticking around to find out.
5. Yeah, if I see an AI disclaimer I'm just skipping lol.

tldr; Gen-AI is just hella optional to me in my daily life and I'm immediately skipping out on anything that uses it if it bears no weight on my life. If that makes me a hypocrite about other things I still choose to engage in, then yeah, I'm a hypocrite. We're human beings after all; there isn't a playbook for this sort of thing. But mods are just so insignificant to me in the grand scheme of things that I'm just not going to bother with any mod with an AI disclaimer or whose creator has admitted to using AI in any capacity. Which isn't a diss at the people who make mods, just me putting my energy elsewhere.
Dark Thoughts Posted Wednesday at 04:02 AM

20 hours ago, Grym7er said: So I think vibe coding a simple mod is totally fine, achievable and maintainable (if you can foot the bill for vibecoding), but beyond that, rather learn how to do it yourself.

I actually think vibe coding in a non-professional setting is fine, although I would still argue in favor of a disclaimer regardless. A lot of people simply are not capable of learning to program themselves. I for example tried when I was younger, but I have very little ability to focus and concentrate (which got even worse over the decades). Programming also requires pretty much constant learning. It's an ever-evolving thing where you have to continue to educate yourself on changes & additions all the freaking time. Not everyone is willing to spend that much of their life on that. I see this more like people doing image gen for personal use, but maybe they think someone else might have interest in the mod they wanted too and decide to share it for those who may be looking for something similar. The only other options here are a) making mod suggestions and praying (I don't think modders really even lurk in those types of subforums) or b) being rich enough to find someone to commission it - but that comes with similar caveats regarding maintenance, or even just finding modders who do commissions in the first place.
Broccoli Clock Posted Wednesday at 07:30 AM

People are lazy, people are stupid, people use AI. It's that simple. If you could code, you would, you can't so you cheat.
Rainbow Fresh Posted Wednesday at 07:45 AM

14 minutes ago, Broccoli Clock said: People are lazy, people are stupid, people use AI. It's that simple. If you could code, you would, you can't so you cheat.

While that is certainly true for many people, what about actual software developers with a full-time job and a decade of hands-on manual coding experience using AI tool assistance to speed things up or skip unnecessarily tedious tasks?
Grym7er Posted Wednesday at 07:46 AM Author

7 minutes ago, Broccoli Clock said: If you could code, you would, you can't so you cheat.

I dunno, the logic seems flawed.

"If you could do math, you would, you can't so you cheat (by using a calculator)"
"If you could eat your food, you would, but you can't so you cheat (by using cutlery)"
"If you could write, you would, but you can't so you cheat (by using a word processor)"
"If you could code, you would, you can't so you cheat (by using StackOverflow in 2020)"

These are all cases where it is entirely feasible and acceptable to do the thing without the 'tools of cheating'. How is using AI to code any different? I will clarify: I am not talking about vibe coding at all, but about using AI to write code when you understand the code and the problem. Coding is merely explaining to the computer how to solve a problem using a language the computer understands. With AI, computers are starting to understand natural-language explanations of how to solve a problem, so what's the difference?
Broccoli Clock Posted Wednesday at 08:07 AM

10 minutes ago, Rainbow Fresh said: While that is certainly true for many people, what about actual software developers with a full-time job and a decade of hands-on manual coding experience using AI tool assistance to speed things up or skip unnecessarily tedious tasks?

Ok, so this is actually two problems: one being the devs, the other being those who make the final sign-off. The people who give the final OK to code are middle managers who couldn't give a single f*ck if the code is good, bad or just ugly. They don't care if it's open to vulnerabilities or is difficult/impossible to refactor; they don't care if the code reinvents the wheel and is built in isolation. What they care about is whether it works for that one particular situation, for that one particular instance. AI also doesn't care about those things.

As for the devs, that in itself seems to split into two parts. Either they are lazy and will rely on any old sh*t to get a pay packet, or they are forced by management to use it because management will be replacing those devs with AI (and probably have for the most part), so in essence they are cutting their own throats (career-wise, that is). It's like a guilty person building a gallows.

10 minutes ago, Grym7er said: I dunno, the logic seems flawed.

Your response is so flawed that I genuinely suggest you don't take the safety labels off your furniture.
Rainbow Fresh Posted Wednesday at 08:12 AM (edited)

12 minutes ago, Broccoli Clock said: As for the devs, that in itself seems to split into two parts. Either they are lazy and will rely on any old sh*t to get a pay packet, or they are forced by management to use it because management will be replacing those devs with AI (and probably have for the most part), so in essence they are cutting their own throats (career-wise, that is). It's like a guilty person building a gallows.

Hmm. I see. Welp, guess I am making a living with "AI slop" then, and my still-remaining many, many hours of good old work are just inefficiencies I have not yet outsourced to the great AI cloud. Voluntarily, mind you. That however also means, especially regarding your other statement

12 minutes ago, Broccoli Clock said: Your response is so flawed that I genuinely suggest you don't take the safety labels off your furniture.

that I'd recommend you stay away from the internet, as I can 100% guarantee you that you are interacting with "AI slop" on the daily just by doing anything out there.

Edited Wednesday at 08:20 AM by Rainbow Fresh
Grym7er Posted Wednesday at 08:12 AM Author

4 minutes ago, Broccoli Clock said: Your response is so flawed that I genuinely suggest you don't take the safety labels off your furniture.

Please do elaborate. How else will I learn?
Broccoli Clock Posted Wednesday at 08:25 AM

2 minutes ago, Grym7er said: Please do elaborate. How else will I learn?

You literally equated using a calculator to using AI to help build code. I'm not here to educate you.

6 minutes ago, Rainbow Fresh said: Hmm. I see. Welp, guess I am making a living with "AI slop" then, and my still-remaining many, many hours of good old work are just inefficiencies I have not yet outsourced to the great AI cloud. Voluntarily, mind you.

You say that like it's a badge of honour, yet you are literally killing the very industry that gave you your career. Such a short-sighted mindset.

7 minutes ago, Rainbow Fresh said: that I'd recommend you stay away from the internet, as I can 100% guarantee you that you are interacting with "AI slop" on the daily just by doing anything out there.

If you wanted to move the goalposts you probably should have said so. Are we talking developers making a curated choice, which was your point, or are we now talking about the endless different types of shitty "AI" that are out there? The latter has a much greater significance, because it seems like an awful lot of very, very stupid people out there will believe any old shit they are told (or shown) and are actively being manipulated by it.
Grym7er Posted Wednesday at 08:35 AM Author

1 minute ago, Broccoli Clock said: You literally equated using a calculator to using AI to help build code. I'm not here to educate you.

The concepts are similar, though no analogy is perfect. At the root, both are tools that make challenging tasks easier for and more accessible to an average person. We've just gotten extremely accustomed to being able to do fairly complicated maths on a device that everyone carries around in their pocket. A few years from now, it is fairly plausible that any John Doe with a phone will be able to quickly prompt an AI to build a tool to solve a very specific problem or challenge for them right then and there. And so you can quickly see how AI coding could become as standard in day-to-day life as a calculator. I'm not saying it will, because I'm of the opinion that the AI industry is built on matchsticks and will probably collapse. AI-assisted coding isn't likely to disappear, though; it's too efficient not to use (again, not vibe coding).
Rainbow Fresh Posted Wednesday at 08:38 AM

1 minute ago, Broccoli Clock said: You say that like it's a badge of honour, yet you are literally killing the very industry that gave you your career. Such a short-sighted mindset.

I am saying that like it's the reality I have been working with for the last several months, and I feel neither threatened with being replaced by AI nor pressured to use it. I am working with it to a sane extent; I know its flaws and shortcomings as well as the areas in which it can arguably improve my own work efficiency without sacrificing any of the "human seal of approval" quality of the end result. I can also assure you that I do care about the projects I work on, not from a paycheck perspective but from a "I just love coding and I just love to create something that makes people's lives easier" perspective. Whether you can find the headspace to believe this or not, I leave up to you.

5 minutes ago, Broccoli Clock said: If you wanted to move the goalposts you probably should have said so. Are we talking developers making a curated choice, which was your point, or are we now talking about the endless different types of shitty "AI" that are out there? The latter has a much greater significance, because it seems like an awful lot of very, very stupid people out there will believe any old shit they are told (or shown) and are actively being manipulated by it.

I am not moving goalposts; I am merely trying to showcase the main flaw in your buckshot bird hunt approach to logic, like Grym7er tried, clearly also to no avail. Following your "it's that simple" logic and your previous explanation that any sort of AI use in coding is a no-go, an immediate sign of "AI slop", disengagement and capitalist greed: this very forum we are typing in right now is, as the copyright notice at the bottom of the page shows, powered by a product from "Invision Community".
Seeing as AI disclaimers, especially for minuscule "behind the scenes" use, are not really a thing, and I can't be bothered to sign up for a free trial to try and snoop around their source code, I have no definitive proof. But statistically speaking, the sheer size of Invision Community and their product fleet essentially guarantees a near 100% probability of at least one developer on their team having used AI tool assistance to at least the same extent as I do. Following your previous explanations, that makes this forum software "AI slop" and Invision Community a greedy corporation not "giving a shit" as long as it works and makes money. Furthermore, because Anegos Studio decided to use this software instead of making their own 100% human-work-fueled alternative, they by extension also do not care, as long as it works and makes them money. Finally, you are here, having paid money for one of Anegos Studio's products (you bought Vintage Story), and are now actively engaging with "AI slop" software. Because your buckshot logic is so wide-spread, essentially anything on the internet created or updated in at least the last 5 years is tied to AI slop at some point, and if that is such a huge no-go, not using the internet as a whole is the only cure.

However, I have said my opinion, you have said yours. It is clear that the two of us will not come to any agreement, so I will now abandon this specific chain of discussion and will kindly ask you to do the same, as I do not want to make the warning of the guy a couple messages up come true.
Crabsoft Posted Wednesday at 08:43 AM

I'm a total AI hater/non-believer, but you should never argue from a position that your opponent is stupid/evil/ineffective. Even if you have to frame it as potential, give them the steelman treatment and defeat the ideal. Otherwise, you'll always be talking about entirely different perceived realities. Assume that it can do all the amazing things they claim. Assume they have had a different experience than you. Anything less is shouting into the void.
Broccoli Clock Posted Wednesday at 09:23 AM (edited)

49 minutes ago, Grym7er said: The concepts are similar, though no analogy is perfect. At the root, both are tools that make challenging tasks easier for and more accessible to an average person. We've just gotten extremely accustomed to being able to do fairly complicated maths on a device that everyone carries around in their pocket. A few years from now, it is fairly plausible that any John Doe with a phone will be able to quickly prompt an AI to build a tool to solve a very specific problem or challenge for them right then and there. And so you can quickly see how AI coding could become as standard in day-to-day life as a calculator.

How to take an already tortured analogy and torture it more, then add a motte-and-bailey at the end. Not an ideal response, to be fair.

49 minutes ago, Grym7er said: I'm not saying it will, because I'm of the opinion that the AI industry is built on matchsticks and will probably collapse. AI-assisted coding isn't likely to disappear, though; it's too efficient not to use (again, not vibe coding).

AI is going to replace everything, and it's got nothing to do with what you or I do. It is all down to money; that's it, that's all companies care about. AI is "free"*, humans are not. If several centuries of companies literally choosing the worst option purely because it's cheaper doesn't convince you of that, I'm not sure what will. I'm not some King Cnut here, I'm not trying to stop the tide from coming in. I am standing on the shore watching, knowing we're fucked and knowing that I am watching the "end of coding" - at least generic coding; bespoke will obviously still exist for a limited time, but not forever.

* AI is obviously not free, but middle management will consider it in that manner.
46 minutes ago, Rainbow Fresh said: I am saying that like it's the reality I have been working with for the last several months, and I feel neither threatened with being replaced by AI nor pressured to use it.

I'm sorry, but that's just naive. Let's take an example, shall we? I worked in web dev for most of my career, so I'll use that sector of IT. Traditionally you had 4 main parts of a front end web dev team: designer, coder, manager/backend liaison, and client liaison. What does a front end web dev team look like now? Nothing, because literally every job is now replaced by AI. Coders? No need. Designers? No need. Manager? No staff, so no need. Client liaison? No need. Every single stage in the front end development of a web app can now effectively be done by "people" who don't need a pay packet, don't need time off, don't need an office. Do you think that is attractive to upper management who care little for anything but the balance sheet? Also, do you think that is beneficial to the industry? Look around right now: do you still have traditional front end web dev teams? Some do, but most have dispensed with the majority of them, tending to bring in one contractor to perhaps oversee the automated tasks, but in time that will be removed too. It'll literally be like an API library.

Now, in some ways that was always coming. Again, just using the front end as an example, we've seen bespoke code being replaced by libraries, then back end integration, to the point now where you are pulling down a React library and a whole set of templates just to get started. The move to being completely automated was always happening; AI is just an accelerant, but one with some significant downsides. So are you really sure your job cannot be done by AI? If you are in any position along the development chain then it can be, so whether you feel threatened is up to you, but if "your hunch" is good enough for you then fine.

46 minutes ago, Rainbow Fresh said: However.
I have said my opinion, you have said yours. It is clear that the two of us will not come to any agreement, so I will now abandon this specific chain of discussion and will kindly ask you to do the same, as I do not want to make the warning of the guy a couple messages up come true.

Believe it or not, I do understand your position; it's just that someone who is clearly as technical as you are should not be so blinkered to both the current and future harm it does and will do. That's at all levels, whether it's coding-specific or more existential. As for replying, that's your call. Open forum and all that...

40 minutes ago, Crabsoft said: I'm a total AI hater/non-believer, but you should never argue from a position that your opponent is stupid/evil/ineffective. Even if you have to frame it as potential, give them the steelman treatment and defeat the ideal. Otherwise, you'll always be talking about entirely different perceived realities. Assume that it can do all the amazing things they claim. Assume they have had a different experience than you. Anything less is shouting into the void.

I never said AI was ineffective; it is however inefficient, because it's based on weighting and not reality. We have numerous examples of AI outputting both "stupid" and "evil" solutions to things. Although that's not AI's fault: it doesn't know what good or evil is, and if it tried to work it out it would rely on a vox populi method of judging morality based on the source material it was trained on. That material would be things like X or Reddit posts, then with an extra weighting added by whoever is running it. Neither of those things is designed to provide "moral" results, just "popular" ones. The tedious Musk demonstrated this with aplomb with his constant manipulation of Grok so that it would support his mental world view. So, no, AI is not "stupid" or "evil", but the people who control it are.
If it was a case where we were working in hypotheticals I would agree with your point, but we are well into "this has to be a fucking parody" territory when you have people like Thiel advocating for global genocide so that his AI could rebuild the world (in his perverse, bigoted and hateful image). If you think I'm being hyperbolic about that, he has gone on podcasts to say this exact thing. This is the same Thiel that has direct access to the US government and all its data. Now, granted, that's not directly related to someone using Copilot to restructure a messy set of nested loops, but it's not unrelated either.

Edited Wednesday at 09:24 AM by Broccoli Clock
Grym7er Posted Wednesday at 09:27 AM Author

2 minutes ago, Broccoli Clock said: How to take an already tortured analogy and torture it more, then add a motte-and-bailey at the end. Not an ideal response, to be fair.

Alright, I'll do what Rainbow Fresh did and just accept that I'm not going to get a reasonable conversation going here. Have a good one.
Broccoli Clock Posted Wednesday at 09:31 AM

Just now, Grym7er said: Alright, I'll do what Rainbow Fresh did and just accept that I'm not going to get a reasonable conversation going here. Have a good one.

Here's the thing, bud: that response is performative. You don't need to tell anyone you aren't replying; you just don't reply. However, here's the thing: you asked a question in an open forum, and when you didn't like the answer you gave a facetious reply and got annoyed when you got a similar response. If you don't like people giving replies to your questions, then don't post them. It really is that simple.
Thorfinn Posted Wednesday at 03:04 PM

6 hours ago, Broccoli Clock said: The people who give the final OK to code are middle managers who couldn't give a single f*ck if the code is good, bad or just ugly.

And why should they? By the time the code becomes obsolete, there will be a couple of new versions of ChatGPT (or whatever) that "write" better code. The better coders could be spending their time improving the codebase that ChatGPT uses to create its Frankenware. A decade or so ago, politicians who were putting coal miners out of work counseled, "Learn to code." Now, with all the electrical demands of data centers, they are going to start telling laid-off coders, "Learn to mine coal."
Teh Pizza Lady Posted Wednesday at 03:21 PM

13 hours ago, Tj Pepler - Critcher said: Why oh why does this keep coming up? This post, just like every other one like it before, will be locked by the mods by tomorrow because it's gonna get heated and lead nowhere. Not that your questions aren't valid, it's just that level heads do not prevail in this arena. Personally, I have made no attempt to hide that I use AI tools in my workflow. I have a little robot I post right on the main image to identify that I use AI. If you don't like AI and you see the little robot, walk away, don't post, don't tell me your opinion, just walk away.

- I plan and design my own mods without AI help
- I scaffold my projects by hand
- I write all the preliminary code as manually as you'd consider VS Code with tab completion manual
- I have build templates I made, and I let the AI run the build cause it's stupid grunt work
- I have the AI run tests for many hours overnight, running and rerunning tests and outputting errors to a log I can audit manually
- I then test and run across many games manually to make sure it runs with different mod loadouts
- When I have a bug I can't identify instantly I always ask the AI for its opinion; this can save sooooo many hours and HAS saved me so many hours
- Then deploy

Anyone who doesn't like that can go play with other mods, no hard feelings. (But I have some cool mods and people seem to like them)

this is an EXCELLENT use of AI.

11 hours ago, Dark Thoughts said: Programming also requires pretty much constant learning. It's an ever-evolving thing where you have to continue to educate yourself on changes & additions all the freaking time. Not everyone is willing to spend that much of their life on that.

10,873,498,712,634% TRUUUUUE. this hurts to read. So much time spent on keeping up with current standards TT_TT

5 hours ago, Broccoli Clock said: If you don't like people giving replies to your questions, then don't post them.
If you don't like people rebutting your heated opinions, then don't post them? Pendulums swing both ways until physics says they don't.
Teh Pizza Lady Posted Wednesday at 03:23 PM

Now is also a good time to remind folks that you can ignore forum users you find troublesome. You can do it by hovering over the user's picture icon and then clicking "Ignore" in the bottom right of the window that pops up.
Thorfinn Posted Wednesday at 03:37 PM (edited)

6 hours ago, Broccoli Clock said: We have numerous examples of AI outputting both "stupid" and "evil" solutions to things. Although that's not AI's fault: it doesn't know what good or evil is, and if it tried to work it out it would rely on a vox populi method of judging morality based on the source material it was trained on.

Exactly! What is needed is humans to go through and strip out the "stupid". (I'm not going to add "evil" because most people insist on inserting political definitions of that term rather than an objective one.) The stupid part of training is mostly done. There is little point in crawling through StackOverflow yet again. The chance of anything truly innovative is vanishingly small, probably exactly zero. Now comes the part where you have to skim off the dross, which is, as everyone notes, obscenely large. But as it gets cleaned up, you will need fewer but better coders to continue the work. And here's the rub -- they have to be not only good, but also temperamentally appropriate. You cannot allow even prima donna programmers access to that data if they are also the type to throw their sabots in the gears. But coming out the other side, like @Grym7er points out, pretty much everyone will be able to "create" a custom tool for any job using a smartphone. Think Scotty in Star Trek IV(?) trying to give the formula for transparent aluminum by speaking into the mouse. "Computer. Computer? Oh, a keyboard. How quaint."

Edited Wednesday at 03:39 PM by Thorfinn
Teh Pizza Lady Posted Wednesday at 03:54 PM (edited)

20 minutes ago, Thorfinn said: Exactly! What is needed is humans to go through and strip out the "stupid". (I'm not going to add "evil" because most people insist on inserting political definitions of that term rather than an objective one.) The stupid part of training is mostly done. There is little point in crawling through StackOverflow yet again. The chance of anything truly innovative is vanishingly small, probably exactly zero. Now comes the part where you have to skim off the dross, which is, as everyone notes, obscenely large. But as it gets cleaned up, you will need fewer but better coders to continue the work. And here's the rub -- they have to be not only good, but also temperamentally appropriate. You cannot allow even prima donna programmers access to that data if they are also the type to throw their sabots in the gears. But coming out the other side, like @Grym7er points out, pretty much everyone will be able to "create" a custom tool for any job using a smartphone. Think Scotty in Star Trek IV(?) trying to give the formula for transparent aluminum by speaking into the mouse. "Computer. Computer? Oh, a keyboard. How quaint."

The other side of this that people aren't seeing is that AI figured out that my custom bootloader and kernel, which I'm building in Assembly and C as a learning project, had a bug where it wasn't disabling blink mode for white backgrounds. So when the bootloader ran in Oracle VirtualBox and tried to print text with a white background, the text started to blink. I had no idea this was a thing and was able to grab some code to disable it. QEMU and Bochs had blink mode disabled in the BIOS already, but VBox didn't. I never would have known where to look if it weren't for AI, because Stack Overflow didn't have any answers on it, and asking led to a bunch of "this question has already been asked before" comments and my topic was closed.
EDIT: What I'm implying but didn't state is that this means AI is able to reason past things that aren't necessarily black-and-white answers taken from coding sites. I've searched deep and wide for a solution to that problem and never found anything. I turned to AI out of desperation and it found the issue within... minutes. Edited Wednesday at 03:59 PM by Teh Pizza Lady 2
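For anyone who hits the same thing: the usual fix is the BIOS video service that toggles the blink/intensity bit, so attribute bit 7 means "bright background" instead of "blinking text". A minimal sketch, assuming a 16-bit real-mode bootloader with BIOS services still available (NASM syntax; not necessarily the exact code she used):

```nasm
; Disable VGA text-mode blinking via INT 10h, AX=1003h.
; With BL=0, bit 7 of the attribute byte selects a bright
; background instead of blinking the foreground.
mov ax, 0x1003   ; AH=10h palette functions, AL=03h toggle blink/intensity
xor bl, bl       ; BL=0: intensity (bright background), not blink
int 0x10
```

After this, an attribute like 0xF0 (white background, black text) should render as a solid bright background under VirtualBox too. Once you've left real mode, the same bit can be cleared directly in the VGA attribute controller (index 0x10 via ports 0x3DA/0x3C0) instead.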
Thorfinn Posted Wednesday at 05:13 PM Report Posted Wednesday at 05:13 PM I wouldn't impute the ability to reason quite yet. Any more than it was reasoning when it reads chest x-rays and detects pre-cancerous growths that the best radiologists can't see even when they're pointed out. What it excels at is both noticing tiny details and, more importantly, being able to correlate those details across multiple seemingly unrelated fields. That's the guy whose future is threatened by "AI" -- the Renaissance Man.
Diff Posted Wednesday at 06:37 PM Report Posted Wednesday at 06:37 PM (edited) 1 hour ago, Thorfinn said: I wouldn't impute the ability to reason quite yet. Any more than it was reasoning when it reads chest x-rays and detects pre-cancerous growths that the best radiologists can't see even if you point it out. What it excels at is both noticing tiny details, and, more importantly, being able to correlate these details across multiple seemingly unrelated fields. That's the guy whose future threatened by "AI" -- the Renaissance Man. This isn't the first time I've seen this claim, and I can see how it could be true of some artificially intelligent system... but I don't see any present evidence for it. Instead, my own experience hints at the opposite: LLMs are blinded by a magnetic pull into well-worn ruts in the training data. That makes immediate sense to me based on how these models are trained and operate. Way back you mentioned its struggles with recursion, and I think that's a good indicator of what I'm talking about. Like a lot of people, I have a few troublesome tasks in my back pocket I use as benchmarks when a new model comes out. One of those is having it toy around with the Gleam programming language. It knows Gleam; it's not a tiny homebrew language. It can tell you all about Gleam, recommend popular libraries in the ecosystem, tell you how it interacts with the BEAM. It can tell you the exact rules it will then go on to violate when it comes time to generate code, even with precise rules, tools, tips, and reminders in the context window. It fails to reason about something it knows about because the probabilities point elsewhere in the distribution of training data. This also goes for projects in common languages (JavaScript) with atypical ways of doing things. One of my pet projects is a 3D engine that renders to 2D SVG. From SVG it inherits many quirks that are deeply atypical in a 3D world.
AI is quite helpful as a research tool here, because the things I'm doing are not unknown mathematically; they just violate convention. But those broken conventions mean that when it comes time to write, it constantly gets pulled off the rails by stronger signals in the training data. Edited Wednesday at 06:51 PM by Diff
Thorfinn Posted Wednesday at 10:21 PM Report Posted Wednesday at 10:21 PM 3 hours ago, Diff said: I can see how it could be true of some artificially intelligent system... but I don't see any present evidence for it. Instead, my own experience hints at the opposite. LLMs are blinded by a magnetic pull into well-worn ruts in the training data. Agreed. Which is why I said it needs to move away from LLMs. There's practically nothing more it can "learn". Its LLM is now at the stage where it needs to be pruned -- it has "learned" too many poor programming practices. Stupid human tricks, with the resulting GIGO. To advance, it's going to require people who know what they are doing to go through it and ruthlessly say, "This is crap." People who are not ideologically opposed -- those are the ones who would sabotage it, at least at the margin. Again, though, I do not like the idea of calling any algorithm intelligent or capable of reason. That is not in evidence, and, to the best of my knowledge, not something you achieve by just throwing more petaflops at it. No one has yet demonstrated that to be true, anyway. So far, the closest we have is the assertion from strict materialists that the brain is just an organic computer, and since the human brain is conscious, the silicon brain is, too, once you add enough transistors. Pure faith. Or circular logic, depending on how you want to look at it.