Hacker News | crackalamoo's comments

I don't think this article is paywalled? You can just click "I'll do it later" on the banner


To some extent, AI needs less extensive teaching than the problem-solving skills taught in school because it works in natural language anyways and builds off whatever else you know. You can easily talk to an AI about CS if you know CS, or even use it for coding, but not so much if you only know "AI".

It's not that learning AI isn't important, it's just that school isn't necessarily the place for it: AI is always changing, and can be learned relatively easily online, while properly learning something like linear algebra or operating systems online is pretty unlikely except for a very motivated student.


I made a custom GPT of myself using my blog. It understood who I am, but wasn't able to replicate me very well, and mostly sounded like generic ChatGPT with some added interest in my interests.

I would imagine fine-tuning with enough data would be different, though.
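Something like this is what I have in mind, a rough sketch with the OpenAI fine-tuning API (the file name and model here are placeholders, not something I've actually run):

    # Rough sketch only: fine-tune on blog posts exported as chat-format
    # JSONL. The file name and model name are placeholders.
    from openai import OpenAI

    client = OpenAI()

    # Upload training data; each line is {"messages": [...]}.
    f = client.files.create(file=open("blog_posts.jsonl", "rb"),
                            purpose="fine-tune")

    # Start the fine-tuning job, then poll its status later.
    job = client.fine_tuning.jobs.create(training_file=f.id,
                                         model="gpt-4o-mini-2024-07-18")
    print(job.id, job.status)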


Your output is the map, the map of your experiences. If you make a map of the map (by training a model on your output / the map), this is two abstractions away from the human being experiencing the world with all the errors and uncertainties encoded in both maps.


Prolific authors such as Hitchens (who passed away in 2011) have been convincingly duplicated by AI.

https://www.youtube.com/watch?v=0qIdEteK0VE


How could you tell without being very close to him personally?


I think standardizing scientific terms and the like is a very minor concern when it comes to preserving endangered languages. Most science is done in English anyways, and that's unlikely to change.


> Most science is done in English

I think that depends; many sufficiently large regions do STEM in regional languages. English is just a language for trade: where there is no need for international communication, there is little reason to fall back to English.


I think the commenter you're replying to might have been referring to pictograms like many Chinese characters, not icons that you have to figure out. Or maybe not.


> pictograms like many Chinese characters

A handful of Chinese characters are pictograms. As far as I recall, that is by far the smallest class of characters, and all characters, including the ones that started as pictograms, are treated by modern readers as phonetic indicators.

Compare e.g. 象 to https://img.zdic.net/zy/jinwen/33_E87E.svg .

They are the same character. Does that help you if you're looking at 象?


Can machine learning accomplish the same? See e.g. Microsoft MatterSim

https://www.microsoft.com/en-us/research/blog/mattersim-a-de...


Isn't that what Nvidia Omniverse is?


Yeah, that's the ambition.

More generally, ML is good at approximating many NP-hard problems efficiently, so I wonder if it will be a more practical alternative to quantum computing for things like molecular simulation.
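As a toy of what I mean by approximating an expensive computation (nothing to do with MatterSim itself, just a made-up surrogate example):

    # Toy surrogate sketch: learn a cheap approximation of an "expensive"
    # energy function, then query the model instead of the solver.
    import numpy as np
    from sklearn.neural_network import MLPRegressor

    def expensive_energy(r):
        # Stand-in for a costly simulation: a Lennard-Jones-style pair energy.
        return 4.0 * ((1.0 / r) ** 12 - (1.0 / r) ** 6)

    r_train = np.random.uniform(0.9, 3.0, size=(2000, 1))
    e_train = expensive_energy(r_train).ravel()

    surrogate = MLPRegressor(hidden_layer_sizes=(64, 64), max_iter=2000)
    surrogate.fit(r_train, e_train)

    r_test = np.array([[1.12], [1.5], [2.5]])
    print(surrogate.predict(r_test))         # fast approximate energies
    print(expensive_energy(r_test).ravel())  # exact values for comparison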


You can do both. Preserving the corpus and building the LLM probably gives the best chance for future generations.


But some of those 126,301 people who edited in the last 30 days have edited more than one article. In fact, some have made millions of edits over their lifetimes, which disproportionately inflates the total. At least 5,000 people have edited more than 24,000 times.

https://en.wikipedia.org/wiki/Wikipedia:List_of_Wikipedians_...

(And also: each editor has (approximately) 2 eyes :) )
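A toy calculation with made-up numbers, just to show how a heavy-tailed edit distribution pushes the average well above what a typical editor does:

    # Made-up distribution, not real Wikipedia data: illustrates how a few
    # very prolific editors inflate the mean edits-per-editor figure.
    import numpy as np

    rng = np.random.default_rng(0)
    editors = 126_301
    # Most editors make a handful of edits; a small tail makes thousands.
    edits = rng.pareto(a=1.2, size=editors) * 3 + 1

    print("mean edits/editor:  ", edits.mean())
    print("median edits/editor:", np.median(edits))
    top_1pct = np.sort(edits)[-editors // 100:]
    print("share of all edits from top 1% of editors:",
          top_1pct.sum() / edits.sum())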


As far as I can tell, OpenAI really did invent RLHF in its current form. https://arxiv.org/pdf/1909.08593
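The core of that paper, as I understand it: train a reward model on human preference comparisons, then do RL against it. A minimal sketch of that kind of pairwise preference loss (schematic, not the paper's actual setup or code):

    # Schematic Bradley-Terry-style preference loss for an RLHF reward model.
    # Not the paper's code; rewards here are dummy scalars for one batch.
    import torch
    import torch.nn.functional as F

    # Scalar rewards the model assigned to the human-preferred response
    # and the rejected response, for a batch of three comparisons.
    reward_chosen = torch.tensor([1.3, 0.2, 2.1], requires_grad=True)
    reward_rejected = torch.tensor([0.9, 0.5, 1.0], requires_grad=True)

    # Maximize the probability that the preferred response scores higher.
    loss = -F.logsigmoid(reward_chosen - reward_rejected).mean()
    loss.backward()
    print(loss.item())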


It probably doesn't matter much either way, but I use vanilla HTML/JS

