Yep. It would generate absolute shit if we tried it today.
Get out of the “hookup culture” or you will never find meaningful human connection. The FOMO is real, and once it’s dismissed your life will improve.
Right, that is the thing you said. Except:
pulling the next most likely word out of a dataset
again, writers practicing their craft in earnest, for free, as part of a gift culture is on the opposite end of the spectrum from having their work exploited by a chatbot company to create a for-profit product.
I’m sorry, but “most likely word”? As if writing a story were some sort of equation or puzzle? I’m honestly not clear what “likely” even means in the context of a creative work.
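For what it’s worth, “likely” here just means statistical frequency: given the words so far, which word tends to come next? A toy sketch of that idea (nothing like a real LLM, which uses a neural network over subword tokens, but the meaning of “likely” is the same):

```python
from collections import Counter, defaultdict

# Tiny made-up corpus for illustration only.
corpus = "the cat sat on the mat and the cat slept on the rug".split()

# Count which word follows which word.
follows = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    follows[prev][nxt] += 1

def most_likely_next(word):
    """Return the word that most often followed `word` in the corpus."""
    return follows[word].most_common(1)[0][0]

print(most_likely_next("the"))  # "cat": it follows "the" twice, vs. once for "mat" and "rug"
```

Scale the corpus up to a large chunk of the internet and replace the counting with a learned model, and “most likely word” stops sounding like a puzzle and starts sounding like a description of style.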
But if a kid read your fanfiction and wanted to write their own, that would be perfectly fine with you? Even if their stories were almost carbon copies of yours because they were a little too inspired? And this would be fine because of romanticist concepts like “humanity” and “soul”?
And of course the tremendous irony of fanfic writers going, “Hey! They used my work without my permission! That’s just wrong!”
I’m a software engineer, and yeah, I look at it and see exactly what was on the tin. the software trained on publicly available texts just as any writer would, and can now generate new content using what it trained on. the work wasn’t “stolen,” just like the fanfic writers did not “steal” the works they are writing fanfics…
I’m struggling to understand the distinction between asking a human to learn how to write fanfic versus asking a machine to learn it.
Isn’t it the case that every writer has “trained their model” off of the hard work of the authors that came before them? If you’ve borrowed a book from a library you’ve even done so at virtually no cost to yourself. Now... there obviously isn’t a clear 1:1 correlation between how humans refine (train) their art and…
Any PoI reference gets a star from me.
Society, at least in the US, is headed away from religion even without AI influence. In 2007, 16% of Americans identified themselves as religious “nones” (atheists, agnostics, and the like) in a Pew Research poll. By 2021 that figure had increased to 29% in another Pew Research poll.
The problem with AI development and the eventual creation of an Artificial Superintelligence (ASI) is that once we start on the path towards it, the genie is already out of the bottle. The US could put all kinds of regulations on the development of an AI, but that won’t stop work on one in China or Europe or any company…
Came here to say exactly this. Of course, that scenario optimistically assumes we don’t snuff ourselves with nukes or global warming induced collapse.
the reason the nuclear analogy is apt isn’t the capability or the work it takes (on those you are right, it’s more like trying to control biological and chemical weapons). it’s that once military AI exists, it changes the geopolitical landscape the way nuclear weapons did. If you have them and your enemy does not, doesn’t…
The problem with any attempt at regulation or limits is twofold: first, we have a global economy; second, the first-mover advantage is so enormous that it carries serious risks.
I hope the next sentient life form that develops on this planet evolves empathy and intelligence requirements in order to function.
Have you considered that there may be other reasons for researching life extension? Scientists want this research to grow as well and have multiple aims for this sort of research, such as: