I wrote about my dad some years ago, some time after he had passed away.
I wrote about my stepfather after he had passed in February.
And I wrote about my father-in-law after he had suddenly passed away in May.
Every single time it was hard, emotionally. But it was cathartic at the same time. I remembered what made me feel connected, and what they meant to me.
To outsource this to a machine would have robbed me of emotions, wonderful memories and closure.
But I have to say from experience that writing about them was a totally different thing from reading those words at their funeral.
This was very, very much a different beast. It still makes me tear up thinking about it. It still makes me choke.
I think if you read the piece it's not really about "outsourcing" the writing. It almost feels like the machine's guesses, though not necessarily great writing and not necessarily factual, give the author the courage to flesh out her own writing.
I’m surprised by some of the backlash—it felt to me nearly like a form of digital Ouija board; given a prompt and some subconscious input, a vague or nonsensical response can feel like communion.
The author is using AI as a form of algorithmic MDMA to get in touch with her feelings. I can see how she selects the AI passages that invite deeper reflection and introspection.
Yes, on the surface it feels fake to pick from words that an AI generates. But the same principle applies to a psychologist's suggestive words during a consultation, IMO.
I really like the MDMA comparison; it truly pinpoints it. It seems like most of the people in this thread who disagree with the process, or even show their hatred for it, miss the point. AI in this specific case is not meant to replace a writer, but rather to act as a tool that a writer (or anyone) can use to get everything off their chest and access what had been hidden from them for a while.
Now, people have been arguing whether or not this article is an ad. To be honest, I do not care. The story was moving and it highlights the remarkable progress of AI in writing, which is all that matters to me.
Personally, I found the format of progressively adding to her words and letting the AI fill in the rest quite interesting.
Her sister probably died some years ago (based on her being in junior high at the time, while she's already worked 15+ years reporting on the tech industry) so she probably has some emotional distance and felt like she could be experimental.
However, the last story where most of the words are the author's and the AI only adds a handful of phrases was the most poignant to me.
Assuming the article showed exactly what GPT-3 filled in, it's an impressive accomplishment. It almost feels like auto-tune for writers.
I've also been quite impressed with the quality of work that comes out of Google Translate over the last few years. I use Russian to English and English to German regularly. They are both good for rough cut translations that you can quickly polish. Translate uses AI extensively to improve the translation outcome. [1]
I’m not sure I know how to write about how sad and awful I find this article to be. I also can’t think of a worse way for someone to use my death than to write a tech review for a product, much less this completely worthless product. This whole thing reeks of phony.
Calling GPT-3 a product in this context is disingenuous. As it's a fairly new tool, many people have been experimenting with it since it came out. This article is a poignant exploration of using GPT-3 to tell stories. It feels very experimental and artistic to me.
Well, each person is entitled to their reaction, no matter how uncharitable. But I'll say that my impression of the piece was nothing like an advertisement and I thought it was moving.
Apparently not being able to recognize ads is a popular trend. I believe South Park dedicated a whole season to this phenomenon. “Is it news? Or an ad?”
I find it really difficult to write a proper eulogy. So difficult in fact that for the several ones I've been involved with, I usually leave it to someone else with a seemingly higher degree of emotional intelligence than myself. I just can't seem to separate how their death makes me feel from the uplifting things that a eulogy is supposed to say. Here I am, still wondering what truly matters in life, and so don't believe I even have the right to summarize the value of someone else's. There's a conventional way these things get presented, but my belief in conventional ways is not strong enough. That's why I prefer to let someone else do it. Truth be told, I'd probably use a service like this the next time I needed to write one.
If you don't have the ability, for whatever reason, to write a eulogy, I highly recommend finding someone else who can do it authentically. An authentic and effective eulogy is about emotional connection. If you find yourself unable to write one, admit you cannot write one, and don't turn it over to an AI or even a human service. Find a human being with an authentic emotional connection to the deceased who can write it instead.
Downvoters should at least have the fortitude to explain why they disagree.
Yeah, this struck me as a cruel comment, if only out of carelessness. IMHO the comment author shouldn't feel good about striking at someone who's hurting, whether they approve of their coping mechanisms or not.
Since I’m not the one who used my sister’s death to sell a product - and I mean really read what she wrote; she is purposely insinuating that she is having a love affair with the product: “I’ve never read such an accurate Modern Love in my life,” “I felt acutely that there was something illicit about what I was doing,” “One night, when my husband was asleep” - I think I’m being quite nice about it. If I knew this person in real life, I don’t believe I would be as nice.
There’s no product being sold; the Believer isn’t a particularly commercial magazine. Implying something feels thrilling and illicit like an affair doesn’t mean it is an affair. The sister is dead; no one needs to worry about her. Loved ones who are coping with loss may do so in unconventional ways, and they owe nothing to your cynicism.
AI writing believable and coherent words is both fascinating and terrifying, particularly to writers. Then arises the instinct to use it as a creative tool, followed by the impulse to edit and workshop the output.
1. You nailed it: no one cares about the dead sister… not even the author of this piece.
2. No it’s not a real affair because you can’t have an affair with inanimate objects.
3. I am a writer (so your No True Scotsman implication doesn’t work here) and it’s neither fascinating nor terrifying. The only people who feel threatened by it are bullshit artists worried they will get out-bullshitted.
It wasn’t a no true Scotsman, but simply stating an area where generative content has a particular implication. You clearly read it closer than many would, and I almost commented on that. I also think that perhaps some background on the publication and author might relieve some of your specific cynicism.
As for fascination and terror, it is not about GPT-3 itself, but the larger question of our utility in the face of an imagined AI’s eventual capability. Heck, some degree of novelty can be generated by Markov chains, chatbots or Mad Libs.
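To show how cheap that floor of novelty is: a word-level Markov chain fits in a few lines of Python. This is just a sketch; the two-sentence corpus and the function names are mine, made up purely for illustration, not anything from the article.

```python
import random
from collections import defaultdict

def build_chain(text):
    """Map each word to the list of words observed following it."""
    words = text.split()
    chain = defaultdict(list)
    for prev, nxt in zip(words, words[1:]):
        chain[prev].append(nxt)
    return chain

def babble(chain, word, length=8, seed=1):
    """Random-walk the chain: each step samples a successor of the last word."""
    rng = random.Random(seed)
    out = [word]
    for _ in range(length):
        successors = chain.get(out[-1])
        if not successors:  # dead end: no word ever followed this one
            break
        out.append(rng.choice(successors))
    return " ".join(out)

corpus = "I felt I had lost my heart. I felt I had lost my mind."
chain = build_chain(corpus)
print(babble(chain, "I"))
```

Each step only asks "what words followed this one in the corpus?", yet the recombinations can still read as mildly novel, which is the whole point of the comparison.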
Further, the fact that you are implying the dead are more important than the living dealing with loss would make me question the values by which you determine what is sacred.
If you write for the web, you are in denial. In the next 5 years, either you'll be using an AI tool that you gradually feed input to and whose output you edit, or you'll be replaced by somebody who is doing just that.
I'm curious how informed your opinion is, by the way. You say you're a writer, but that doesn't mean much in this context, since this technology is very new and most people aren't catching on yet. Have you tried out recent tools like conversation.ai? Do you use tools like MarketMuse?
They already use AI and more straightforward techniques to write many articles for the Web, such as market reports. But for creative writing I'm less persuaded, especially when it's not like you have to pay through the nose for writers.
I think I should have been more specific, I was talking more about SEO copywriting than creative writing.
Writers aren't too expensive, you're right - I pay $0.04-0.06 per word for the ten that I employ. But depending on the scale, level of competition, and return potential, bringing in more AI gets attractive and could be necessary at some point.
My point in my comment was just that if you're a writer and aren't paying attention to these new AI tools, you could end up getting blindsided at some point, or you might see your pay decrease and be left wondering why.
Yes, the problem is that I just don’t understand how to use this fresh new innovative product that will take the world by storm with its ability to take the burden and hard work out of the whole writing process. If only I could see the light through these amazing, stupefying, mind-befuddling product demos…
And no, I do not write ad content for the web, nor will I ever get a computer to write my feelings for me. Not in 5 years, not in 10 years, not whatever you tell your up-and-coming investors. The fact that writing about personal grief and MarketMuse are being equated as like things is rather disturbing, in a sociopathic/psychopathic sort of way.
Judging by all of these comments, I believe I may just be in the wrong place and that Hacker News is no longer for people like me.
I don't see how you could have possibly read the piece and come to that conclusion. The point of the article is not at all that the AI does a great job and makes it easy to write with no effort.
I wasn't asking if you'd watched a product demo; I was asking if you'd actually used them. My company does, and they are greatly helpful to our writers.
I was talking about copywriting more than whatever it is you do, so I should have been clearer. Good job rage-quitting HN, btw; hopefully you feel better today, lol.
"The fact that writing about personal grief and marketmuse are being equated as like things is rather disturbing, in a sociopathic/psychopathic sort of way" MarketMuse doesn't write content so this just shows you don't know what you're talking about
I guess there's no accounting for taste. I found the piece a haunting and poetic meditation on grief and recovery.
I will tell you how it felt for me. I felt I had lost half of myself. I felt I had lost my right arm. I felt I had lost my left leg. I felt I had lost my tongue. I felt I had lost my heart. I felt I had lost my mind. I felt I had lost my eyes. I felt I had lost my ears. I felt I had lost my breath. I felt I had lost my voice. I felt I had lost my smile. I felt I had lost my laugh. I felt I had lost my tears. I felt I had lost my future. I felt I had lost my past. I felt I had lost my parents, as well. I felt I had lost everything. I felt I had lost everything.
And yet, I did not lose everything. I did not stop being me. I did not stop existing. There were things I could do: I could make my bed, I could wash the dishes, I could walk the dog, I could feed myself, I could live in the world.
I guess there really isn't. To me this reads like a kindergartener proudly listing all the body parts they know. Or some program repeating words it was given (which is what this actually is). It's not deep or touching at all because it's so comically bad.
It's amazing and raw. Not like a kindergartener. Like a person who is experiencing feelings without a clear sense of place or reason. I thought the prose was an achievement for a computer, and the sequential format allowing the AI to finish each story was a novel medium. The AI was a kind of mirror, extrapolating the mood and content of what was written before.
The AI portions work as the chorus of a song, carrying the mood of the more specific bits written by the human.
I also thought the bit I quoted was evocative of my own process of grief. At first you are shattered. You've lost half yourself. You've lost everything. Then, in the slightest most mundane ways, you heal. I can make the bed. I can live in the world.
It seems pretty clear that something in all this is really bothering you. That's reasonable, but do you suppose there might be more effective ways, than shotgunning insults across a comment thread, to try to work out why it gets to you so?
Insulting others as having mental problems because you aren’t able to relate seems to betray a pretty large lack of tact, not to mention an inability to empathize.
To protest against pretentious people caring about how their death is used, I am going to declare that anyone can do whatever they want with my body, ashes, or anything else, since I won't be alive anyway to care.
Hey, please don't post personal attacks to HN, regardless of how wrong someone else is or you feel they are. Maybe you don't feel they deserve better, but you definitely owe this community better when posting to it.
I don’t think human authors should be worried yet; the AI versions all seemed like what I would expect from an AI: just soulless words, without the insight or deeper meaning we find in quality writing.
I don't know, I thought some of them were great. "I thought my body had died without telling me", for example, I found surprisingly good, and some other parts I found insightful.
What did you find soulless? I wonder if you'd have the same opinion if you didn't know it's GPT-3.
I loved it, but yes I would say those are the right critiques. It worked very well for how it was used to decorate and echo her original text. It couldn't have gone on twice as long with the AI bits without completely falling apart.
"AI" predicts what a human would probably write next. I don't know how that's someday supposed to replace writers, in the same way I don't understand how AI trying to copy artists is a replacement for Van Gogh?
No, the ending seems completely in line with what the model does well. What is impressive about GPT-3 is that it seems good at keeping track of really long-term dependencies in long sequences of text, giving the impression of being able to hold on to trains of thought, a problem that has plagued generative natural-language models for quite a while. I suspect that this is part of why people who have been working in or following that field for a while hype it so much. It's a pretty mechanical result when I put it like that, but it's also something that used to make AI-generated writing extremely obvious, and now that one cue is less likely to give it away.
It's also quite good at coming up with plausible items to continue a list with, which actually gets it into trouble in terms of sounding like human writing sometimes, as in the absurdly long list of body parts featured in one generated example.
So that's why I think it was a pretty plausible generated ending, though of course it's curated here, so it might not have been the first thing the model came up with. The concept of "existing" as a verb had been developed multiple times in the essay as something the author felt she had lost when her sister died, and the thought that she didn't exist was often tightly associated with thinking about the loss directly, in a way that's detectable without that sophisticated a language model. So, along with math (which makes sense if the list is broadly of life skills you could learn from someone), the model bringing up "existing" when there was a convenient list of verbs associated with the sister is, while also a beautiful and fitting way to end the essay, exactly the kind of trick GPT shows itself to be quite good at.
What I like to think about with GPT is to what extent this just shines light on how we as humans compose text. For instance, that kind of long-term association/reuse process you describe sounds like a stylistic technique a human would use. Sure, GPT doesn't have full understanding of why it uses the technique, but watching it "almost do it right" is a great way to learn how style works in isolation, imo.
Sort of. You can see that each successive version of the essay contains what the author had written before, to which she had added more. She then lets GPT-3 continue from wherever she's left off each time. It's actually an inspiringly clever way to use such a model.
I had the same assessment of the technology -- not exactly ready to replace a human writer -- but there was something poignant about the exercise nevertheless.
When trains first ran, it caused great moral upheaval. Old men waving their canes at the machines. Photographs were seen as a cheap alternative for "real" paintings.
What causes the outrage at these models and their output? Is it fear? Notice how critics often deem it necessary to use disparaging remarks:
"soulless, no insight or deeper meaning which we find in quality writing, completely cliched, ignoble, phony"
Well, I’m not speaking for everyone, but since I think I’m the only one who used the word “phony” here, I’ll speak for myself. I used that word not to disparage the tech (the tech I simply called worthless, as it’s not worth even discussing) but the person writing the article. If I were her family, I do not believe I would ever speak to her again until she retracted the product advertising in which she shamelessly used her sister’s death. Her words are what I found disgusting.
I’m also 31 and obviously into tech to be here, so not an old man waving sticks at trains. What bizarre assumptions you have.
After the initial novelty wears off, GPT-3 reveals itself to be something other than a magical fountain of ingenuity. However, ‘useless’ goes too far when talking about it as a writing tool. It can serve a similar purpose to generative games and techniques such as ‘exquisite corpse’ or free writing.
One of the difficult things to do in art is to make something that makes some kind of sense but doesn’t simply follow an obvious logical path. Robert Frost put this as ‘no surprise in the writer, no surprise in the reader.’ These leaps, or ‘illogical conclusions,’ occupy an enormous space of parameters. It’s like a traveling salesman problem where we aren’t sure where the salesman wants to be, or how he traveled, until he has made it to his destination. Generative AI can help search that problem space.
You're really harping on the "product advertising" angle. If you dropped that angle completely, if you didn't think this was about that at all, would you still be mad? Why?
I mean, the author ultimately agrees, right? It's not like she keeps the first iteration and calls it a day, satisfied that she's eulogized her sister. Without the central conceit of her fleshing out the piece more in response to each generated prompt, I don't think it would be worth reading.
I'm surprised by the antipathy towards both author and technology expressed by many of the top level comments here. Personally I found the piece moving, even from the first generated sequence. By the last iteration I had tears in my eyes.
I admit I might be biased. I also have access to the GPT3 beta and have used it for this same purpose of writing assistance. To me GPT3 as of now is like a super advanced thesaurus, a tool that you can use to broaden your perspective and get out of a rut in your writing.
> I do believe in ghosts. Not the ghosts of the dead, but the
> ghosts of the living. The ghosts of people who, because of
> a trauma, have lost their sense of themselves.
This is on par with the best human literature that I've ever read. If this is where GPT-3 is going, then I might be coming along too.
This page behaves really oddly on mobile. Huge swaths of blank space, and the byline repeats itself on every section. And I had to dismiss two modals before seeing any content. Unreadable.
I'm on desktop, and something in the site design just pushed me away.
I can't really say what, except the site itself, not the content, was somehow repellent. The email modal gave me an initial feeling of dread: I'll have to fight this site. Then the site itself felt like it would give up its content only slowly. The colors were just ugly. I gave up without reading much, and I'm pretty sure if I see any other URL containing believermag, I'll just skip it, no matter the content.
Weird, I've never reacted like that to any site, even a recent flickering puke-yellow on black one was bearable. But somehow not this. And I can't even say why.
It worked for me, but it’s easy to imagine why it wouldn’t on another device.
The text is made to render very slowly, line by line, as you scroll. If you scroll quickly, it’s very easy to get way ahead of the rendering. Maybe it’s trying to prevent skimming the article or skipping to the end?
Anyway, the page was riddled with affectations like that, and all of them seem designed to make you feel both uncomfortable and fascinated, like when you find a dead animal covered with maggots… obviously the article fits that aesthetic as well.
Literally within milliseconds, this site popped a full page interstitial asking me to sign up for their writing right in my inbox. Reading the other comments here I don't feel guilty for closing it immediately. Like no, I haven't even read a single word, you jerks.
Why should they be punished? All you do is press X and it goes away. This is hardly onerous, especially given that they are providing interesting content for free. It's far more trouble to get a book from the public library.
I find this kind of writing troubling, but perhaps for a different reason than many people. Writers draw on their experiences as well as previous readings to create new works. Every such work has a provenance that reflects its roots, sometimes imperceptibly but at other times in ways that are obvious to every reader. Knowing these roots also helps readers understand the author's contribution, how it is different from things that came before.
What are the sources of GPT-3's writing? Clearly there were texts that expressed similar feelings, otherwise it would not be able to infer new ones. Is it really creative and if so how? Is this just a new kind of plagiarism? How would you know?
> When I started running, I didn’t even know how to run. I started out running only a half a mile, a mile at the most. I’m not kidding. I was a mile-a-minute man when I started. I remember the first time I ran a mile in under five minutes. I was running on one of my training runs on a Sunday night.
Most likely it's mixing in the phrase "mile-a-minute" with running because writing about running would usually include "miles" and "minutes" in a lot of the text.
The magical thinking people have around "AI" is incredible. Why is it that people don't get that GPT-3 can't think? It has no concept of the world. It has no concept of meaning. It has no concept of anything. It's just statistical extrapolation.
Suggested alternate title: "I didn't know how to write about my sister's death, so I used big data-powered autocomplete to do it for me".
I am as unmoved by this as if the author had bought a stock condolence card from Hallmark. Some people know how to write from their own personal feelings and emotions, and others don't. That's why the greeting card industry even exists.
I'm sure the greeting card industry will be all over longer-form condolence messages such as this using big data powered autocomplete systems like GPT-3.
Why not have AI propose to your would-be spouse, because you find it too hard? Or have AI give the "I want a divorce" speech?
The eulogy isn't "supposed" to be anything, and you can't do it wrong. No one, and nothing, can do yours better than you can. If someone criticizes it, they're not a person you want in your life.
I think this has a use case. Not the current implementation, but there are people out there who truly do need help writing eulogies. They're hard to do, and something to make them easier would be a benefit. I think the author's story here shows how that struggle can play out. Funerals themselves are hard. Coordinating a ceremony that is supposed to coincide with everyone's processing of emotions about the loss of a friend or loved one is stressful. I might even argue that mourning the loss of a loved one according to a schedule is itself unnatural. Now you've got to write an essay about someone's life on top of that?
Personally, it's exceptional for me to have a relationship of any kind where I feel like I truly understand the person deep down. Without that understanding, how can I write a proper eulogy? Furthermore, how do I get the funeral audience to all agree with what I write? Maybe I can describe the deceased's general accomplishments in life, but did graduating from that school really matter to them? What if they hated being there? Were they proud of their time in the military or not? Yes, they had a career, but is that how they wanted to be defined? Presenting any of these as unimportant might hurt the sensibilities of those who came to the funeral, not just in terms of their understanding of the deceased, but also in their general sense of what's valuable in life. Y'all want to hear a hilarious story about that one time we almost got arrested? The most honest and accurate thing I can say about the deceased is what their life meant to me, but no one comes to a funeral to hear about me. So usually I let someone else more in tune with what people in general want take the lead on it.
The above challenges contributed to my decision to include an outline for my own eulogy in my will, as an optional guide at least. This removes the guesswork for those who would be burdened with my funeral arrangements. Not everyone plans ahead on these things, and so a resource for writing a eulogy can be valuable to some. It may not be an answer to the burning meaning-of-life question, but an AI might help the process along.
It's 2021, and maybe it is time there's actually an app for that. I don't mean it in a dystopian sense. Certainly I'd hope it would be offered by a non-profit and not some cryptocurrency startup seeking an ICO. Unless you're an oligarch or ancient ruler, coffins costing more than a used car is a characteristic of modernity. Recently it was announced that FEMA's given $1 billion towards Covid-19 funeral costs since deaths started climbing in 2020. Funerals are expensive and stressful. This is undesirable. Funerals should be efficient where possible. If an app can help with that, then some families would be grateful for it.
GPT-3 is a big data version of autocomplete. It just has a much bigger database and longer training than any autocomplete on your phone. But it's the same idea.
There's no "knowledge" here.
If instead of calling GPT-3 "AI" we called it "Mega-Autocomplete," would the outcome seem just as wondrous?
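The "autocomplete" framing can be made literal: a phone-style suggestion bar is essentially a table of next-word counts. Here's a toy sketch in Python (corpus and names invented for illustration); GPT-3 differs hugely in scale and in conditioning on a long window of preceding text rather than one word, but the training objective, predicting the next token, is the same.

```python
from collections import Counter, defaultdict

def train(corpus):
    """Count how often each word follows each other word."""
    counts = defaultdict(Counter)
    words = corpus.split()
    for prev, nxt in zip(words, words[1:]):
        counts[prev][nxt] += 1
    return counts

def suggest(counts, word):
    """Return the most frequent continuation, like a keyboard suggestion bar."""
    followers = counts.get(word)
    return followers.most_common(1)[0][0] if followers else None

model = train("I felt I had lost my voice. I felt I had lost my smile.")
print(suggest(model, "had"))  # prints "lost" - the word seen most often after "had"
```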