MercuryCobra

Nah man one hedge fund went under years ago but that’s the only time anyone with money felt the pinch here. Ever since then it’s been nothing but delusional bagholders.

There’s an excellent YouTube video exploring why silicon-based life is likely extremely rare, and why we should mostly expect life (as we know it) to be carbon-based:

This exactly. Most people drive <40 miles a day, and usually not all at once. That means that 39 miles of all-electric range, with any amount of charging during the day, would be more than enough to cover 90% of the driving people do without burning a drop of gas.

Shhh you definitely don’t want to be saying things like that where already riled up authors can hear. I can’t tell you how many angry twitter replies I’ve gotten from authors whose work I adore telling me to die in a fire because I think maybe their kid’s kid’s kids shouldn’t be able to dine out on the one book

A) Last I checked we don’t know for sure that the information was obtained unlawfully.

You’re preaching to the converted about how AI isn’t anything like what its hype men would have us believe. My crime here was using the phrase “understand” in a way that you object to, which, fair enough. I never meant to imply AI can do or in this case was doing any kind of critical thinking, nor that it can or does

This is extremely uncharitable. The guy never claimed the machine got it right, just that the machine getting it wrong isn’t a problem. Which is correct; nobody is harmed by the fact that a machine sometimes doesn’t understand how art works, except the people paying for the crappy service claiming it can. Are authors

So, I am a lawyer, and at one time was an IP lawyer for a brief bit, and I think you’re being overzealous in your interpretation of copyright law. Let me ask you this: if I went to a library, checked out a dozen books, painstakingly tallied every time a unique word shows up in each book and some other data about those

Categorizing words in order to do statistical analysis on their use is always going to mean miscategorizing them sometimes. But that’s not copyright infringement, nor unethical, it’s just a reality of linguistic analysis.

Commercial use is always a factor in deciding whether a use is fair, but it’s not always the factor. The Google case actually spends a long time talking about how Google actually does have a commercial aim in doing the work it did, since it promotes other work they do. The case decided this simply wasn’t enough of a

Yeah this is a cynical take but I think ultimately the correct one. The optimistic version of it is that AI won’t end up taking very many real people’s jobs because it’s shitty at doing those jobs. Machine learning as currently conceived can’t and will never be able to produce real art, just poorly constructed, quickly

Man I spent a good chunk of yesterday trying to explain this to folks and there is extremely vehement resistance to being educated on this topic. I get it, people are scared of generative AI, and companies have and will continue to use it for terrible ends (including G/O Media. Fuck y’all btw). But this is a classic

I am a lawyer, and used to be an IP lawyer, and I strongly disagree. Though it’s next to impossible to say for sure whether something is fair use outside paradigmatic examples, I think Prosecraft has an extremely good argument. Despite the sturm und drang about AI there’s no evidence they were actually creating an LLM

I think you’ve unfortunately been misled about what Prosecraft was doing. I agree that AI training is legally and morally fraught. But there’s actually no indication Prosecraft was training an AI. They definitely weren’t producing an LLM or other kind of generative AI. At most they were developing an AI assistant that

I mean, are they? Prosecraft didn’t reproduce the works, it scraped them for data and then reproduced that data. That seems arguably within the bounds of fair use under Authors Guild v. Google.

But he didn’t share copyrighted content, AFAIK. He shared data analysis about copyrighted content. He didn’t reproduce the actual books themselves or any element of them...again, AFAIK.

This is a silly and way overbroad concept of false information. The data analysis here isn’t trying to summarize or otherwise explain the plot of the book. It’s literally just looking at word choice, word frequency, etc. and then categorizing those choices. Of course it will sometimes misidentify whether a given

My understanding is that what the “AI” was doing was basically just generating statistics. Like, number of times a given word was used, page length, etc. The analysis was basically just arranging that data in interesting ways. There wasn’t any real analysis you needed a human being to do, he just automated the
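To make concrete the kind of statistics being described, here’s a minimal sketch in Python of word-frequency counting over a text. This is purely illustrative of the general technique (tallying word counts and surfacing the most common ones), not Prosecraft’s actual code, and the function name and sample text are my own invention:

```python
from collections import Counter
import re

def word_stats(text):
    """Compute simple descriptive statistics over a text:
    total word count, unique word count, and most frequent words.
    A hypothetical sketch of this style of analysis, not any
    real tool's implementation."""
    # Lowercase and split into word tokens (letters and apostrophes)
    words = re.findall(r"[a-z']+", text.lower())
    counts = Counter(words)
    return {
        "total_words": len(words),
        "unique_words": len(counts),
        "top_words": counts.most_common(3),
    }

stats = word_stats("the cat sat on the mat and the dog sat too")
# e.g. stats["top_words"][0] is ("the", 3)
```

Nothing here stores or reproduces the input text itself; the output is only aggregate numbers about it, which is the distinction the comment above is drawing.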

They probably can’t even harness the energy produced. The whole reaction happened at the center of a huge chamber targeted by an array of focused lasers; there’s nowhere to put the plumbing to carry the heat generated anywhere else.