
I can't tell you how much I love that your first example of contrast failure is HN.

> Our favorite orange website isn't leading by example, either. Some comments are almost unreadable


It's worth pointing out that HN's heavily downvoted posts are intentionally hard to read. That's hardly a gotcha. Now, whether they should be is a whole different question.


Orca is amazing, and its creator Devine Lu Linvega is inspiring too.

Listen to these Future of Coding podcast episodes, where he is interviewed about Orca: https://futureofcoding.org/episodes/045 and about making your own tools: https://futureofcoding.org/episodes/044

The Future of Coding podcast is a treasure.

Edit: excerpt from Devine Lu Linvega's intro:

  -----
Devine Lu Linvega and his partner Rekka live on a sailboat. He makes art, music, software, and other cultural artifacts. When Photoshop’s DRM required that he maintain a connection to the internet, he wrote his own creative suite. When his MacBook died in the middle of the ocean, he switched to Linux with hardware he could service. His electricity comes from solar panels, and every joule counts — so that’s out with Chrome and Electron and in with Scheme, C, assembly, and maybe someday Forth.

  -----


"His electricity comes from solar panels, and every joule counts — so that’s out with Chrome and Electron and in with Scheme, C, assembly, and maybe someday Forth."

I used to live and program off grid, too, with a setup I could carry entirely in my backpack ... so I can say it mainly depends on the hardware and firmware in use. My pure Linux laptop did not last very long, even with almost nothing but text editor use.

But my optimized rugged Chromebook does last a long time: with only modest sunshine, I get unlimited work time, even with extensive use of Chrome and Electron.


What rugged Chromebook do you use?


Acer C201.

It might not be available anymore, but I think the successor is similar.

I put it in dev mode and worked mainly with Chrome dev tools as an IDE and a simple text editor for Node scripts. That worked well and I did not need more.

There is in theory a Linux VM, but last time I tried it, it was buggy and performance-hungry, so it was no option for my day's work. Still, it's good to have the option of more powerful tools at hand at times, like Inkscape, because ChromeOS by itself is not so nice to use and very limited in every way. But what works, works.


Think I've read somewhere that those can be reloaded with pure Linux to replace Chrome OS. Not saying you should, only noting that one probably could.


Most recently, he's actually been rewriting all his tools in a Forth-inspired language he designed himself: https://wiki.xxiivv.com/site/uxn.html


I made an online environment for uxn programming, assembling, and emulating, with the CLI environment running via Emscripten:

https://metasyn.github.io/learn-uxn/

Check it out if you're learning uxn, or if you just want to try loading a ROM, seeing the source, and modifying it to learn.


I'd recommend checking out their YouTube channel — they document sailing from Vancouver to New Zealand and back! A lot of this work was done along the way.

https://www.youtube.com/c/HundredRabbits/videos


I'd like to point out that Rekka is also a creator of Orca.


Here is another nice short interview on esoteric.codes:

https://esoteric.codes/blog/100-rabbits


Heh, Forth isn't a write-only language if it can be recognized. Happy accidents.


A consequence of using block-justified text is gaping chasms or "rivers" of white between words, which looks amateurish.


It's not a consequence of that; what you're describing is a consequence of poor justification algorithms plus no manual adjustment. For books, you generally want block-justified text without rivers, and that's why you need a good, tweakable algorithm, which is what GP is complaining about. Lack of that makes it problematic for use in professional publishing work.

Even with very good justification tools (InDesign's being the obvious example, as it's a direct competitor) a professional end result still requires a lot of manual work -- adjusting tracking and kerning, actually editing the text. But without that solid basis it's entirely slow manual work.


I am sure the ROI would be Amazing!


I hope this means I will be able to buy a graphics card soon.


You can buy them right now. In fact, two months ago I bought six 3090s and 15 3080s with two phone calls and maybe twenty clicks.

The part you're missing is the price. It's my secret, but I'll share it here. You can buy a Dell R12 with one of those cards and, upon receiving it, sell the components for more than the purchase price.


Picasa was the best thing; I miss it so.


Upvoting because of spicy sailor talk.


For a 2x speedup, I am not sure I would be willing to sacrifice legibility as in the example. The OOP method definition reads like English.

The article suggests the true benefits of DOP aren't all that great unless you understand the target architecture.

I feel like the pendulum is at its new zenith.


As always, it depends on the problem you're solving.

A 2x speedup in the inner loops of a game can be a very big deal. And for business workloads, I occasionally spend time babysitting clusters running batch jobs. A 2x performance increase in the right inner loop might save a couple hundred dollars per run.

So certainly, profile before you optimize, as the article demonstrates. But a 2x speedup can make it worth applying a "struct of arrays" transform.
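
For anyone unfamiliar with the transform, here is a minimal sketch; the Particle/Particles names are made up for illustration and aren't from the article:

    // "Array of structs": fields interleaved in memory.
    // The AoS form of the collection would simply be Vec<Particle>.
    struct Particle {
        position: [f32; 2],
        velocity: [f32; 2],
    }

    // "Struct of arrays": one column per field, so a loop that only
    // touches positions streams through contiguous memory and wastes
    // no cache space on the other fields.
    struct Particles {
        position: Vec<[f32; 2]>,
        velocity: Vec<[f32; 2]>,
    }

    impl Particles {
        fn integrate(&mut self, dt: f32) {
            for (p, v) in self.position.iter_mut().zip(&self.velocity) {
                p[0] += v[0] * dt;
                p[1] += v[1] * dt;
            }
        }
    }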


The interesting thing to me from these patterns is the possibility of using proc_macros to write code in the "array of structs" style while the code gets desugared to "struct of arrays". I don't know how beneficial that would be, but having it as an option would make this pattern more approachable to people who would otherwise not use it, whether due to verbosity, mental model mismatch, or any other reason people might have not to do this.

I don't think Rust itself will ever incorporate this feature into the language, just like it doesn't transparently wrap a `dyn Trait` into a `Box<dyn Trait>`, but I would expect that it will never put active roadblocks for libraries to extend the language in such a way. (The regular proc_macros caveats would apply, of course.)
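
To make that concrete, here is a rough sketch of what such a macro's input and expansion could look like. The `Soa` derive and the `PlayerVec` name are invented for illustration, not any real crate's API:

    // The user would write the familiar "array of structs" definition:
    //
    //     #[derive(Soa)]                 // hypothetical attribute
    //     struct Player {
    //         location: (f32, f32),
    //         velocity: (f32, f32),
    //     }
    //
    // and the proc_macro would expand it to roughly this
    // "struct of arrays" container:
    struct PlayerVec {
        location: Vec<(f32, f32)>,
        velocity: Vec<(f32, f32)>,
    }

    impl PlayerVec {
        // Generated helpers keep call sites looking like the AoS
        // version even though the storage is columnar.
        fn push(&mut self, location: (f32, f32), velocity: (f32, f32)) {
            self.location.push(location);
            self.velocity.push(velocity);
        }
    }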


It seems likely to me that Rust will use ECS queries as the idiomatic way to write struct-of-arrays code.

In archetype-based ECS libraries (which the ecosystem seems to be converging on or near) you write a query asking for a particular subset of "component" types, where each type is stored in its own array(s), and then for each iteration of the query you get back one reference to each component type.

As a result, all the extra zipping (which I'm assuming is what people find less readable about the article's example) is handled in the query implementation, and you get this sort of hybrid between the two, syntactically speaking:

    for (location, velocity, acceleration) in players.query::<(Location, Velocity, Acceleration)>() {
        *location = (
            location.0 + velocity.0,
            location.1 + velocity.1,
        );
        *velocity = (
            velocity.0 + acceleration.0,
            velocity.1 + acceleration.1,
        );
    }
Some libraries even let you write a query as a struct with a #[derive] on it, rather than a tuple, so you can use essentially the same syntax as the "OOP" example from the article, rather than destructuring the "fields" up front.
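
Roughly, that derive-based form could look like this (the `Query` derive and the exact iteration API here are stand-ins, not any specific crate's API):

    #[derive(Query)]          // stand-in derive, not a specific crate
    struct Movement<'a> {
        location: &'a mut Location,
        velocity: &'a mut Velocity,
        acceleration: &'a Acceleration,
    }

    for m in players.query::<Movement>() {
        m.location.0 += m.velocity.0;
        m.location.1 += m.velocity.1;
        m.velocity.0 += m.acceleration.0;
        m.velocity.1 += m.acceleration.1;
    }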


Tiny nitpick: that turbofish syntax wouldn't be possible due to lack of support for variadic type parameters, but otherwise this seems reasonable.


Whoops, the actual libraries I'm thinking of wrap those args in a tuple. I've updated the example.


> The interesting thing to me from these patterns is the possibility of using proc_macros to write code in the "array of structs" style while the code gets desugared to "struct of arrays".

This is not possible in the general case, because you can have a general pointer/reference to a single struct in an "array of structs" but a "struct of arrays" can only be accessed by indexing each array separately. So only very simple and self-contained programs can possibly be "desugared" in this way.


In "struct of arrays" all the arrays have the same size and indexing, so the array index into "array of structs" is the same as the index into each array in "struct of arrays".

A general pointer/reference type that might point to a single struct in "array of structs" can be transformed to a "wide pointer" representation as pointer/reference to the array and index into the array.

If analysis can't prove the pointer is constant, then the representation will be an opaque wide token outside the transformed code, which is fine. If it can prove the pointer is constant, the run time representation does not need to be wide, as it's just the index, also opaque outside the transformed code.

There's no need to have a representation as a regular pointer, because there's no way the struct pointer/reference type can be dereferenced outside of transformed code anyway, only passed around.
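
In Rust terms, a sketch of that wide representation might look like this (the names are illustrative, not from any particular compiler or library):

    // The SoA storage: all columns share the same length and indexing.
    struct Players {
        location: Vec<(f32, f32)>,
        velocity: Vec<(f32, f32)>,
    }

    // The "wide pointer": a reference to the whole container plus an
    // index. Accessing a field is just an index into the matching column.
    struct PlayerRef<'a> {
        all: &'a Players,
        idx: usize,
    }

    impl<'a> PlayerRef<'a> {
        fn location(&self) -> (f32, f32) { self.all.location[self.idx] }
        fn velocity(&self) -> (f32, f32) { self.all.velocity[self.idx] }
    }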


This is Jonathan Blow's goal with his in-development language Jai, see for example this video:

https://www.youtube.com/watch?v=YGTZr6bmNmk


I think the data-oriented version is very appropriate for a game, where performance is king and there's not that much maintenance afterwards once it's done. Perhaps I'm wrong, but games seem to be very much a finish-it-and-toss-it-over-the-wall sort of project. Much less iteration than in other contexts.


That is changing over time as post-launch support grows and some games shift to games-as-a-service.


Are the "games as a service" games adding significant code after launch? I think most of the performance critical stuff is in the engine and core game loops. Content packs and seasons aren't likely to change those core bits, and instead be more assets and scripting intensive.


Destiny 2 just had to announce a complete re-do of how they're doing content now and in the future, because the game is significantly held down by technical debt from previous content.

https://www.bungie.net/en/Explore/Detail/News/49189


Yes, they do, quite often. A certain amount of maintenance (bugfixing and optimisation) is required, but you are also improving core systems to enable new capabilities and increase other developers' productivity.


I think DoD can actually be ergonomic if you're doing it for the architecture reasons rather than the optimization reasons, to the point that you choose DoD because of them. At least I personally have found that, e.g., an ECS architecture can lead to separations and structurings of logic code that were helpful, while still defining data in ways that I could just make accessible to tooling and so on.

In this talk on their architecture for Overwatch: https://youtu.be/W3aieHjyNvw (highly recommend) you can see that the wins / concerns they talk about have to do with architecture more than perf. Big insights at 8:00 in the video.


The cynical anti-intellectualism in this thread is bracing.

All part of the zeitgeist I guess.

News is fake. Science is fake. Schools are barriers. Everything is subjective, objective reality is nonexistent.

How do we have productive disagreements going forward?


> News is fake. Science is fake. Schools are barriers. Everything is subjective, objective reality is nonexistent. How do we have productive disagreements going forward?

Funny you describe it that way. I'd argue that young people in STEM fields, including IS/CIS/CompSci undergrad programs, think everything can be objective when that clearly is not the case.

You don't need to go to college to press buttons, fill out spreadsheets, or input code until you get the output you seek. You need to go to college to make the subjective decisions, which don't have a clear right/wrong answer.


I can't word this in a non-snarky way, but it's a genuine question:

Why do you believe that college can teach making subjective decisions?


Because moral philosophy and epistemology have been a thing for well over 4,000 years, and college could easily teach the basics.


This.

The humanities are glossed over at best in public high schools across the United States. I don't know of one that requires a PHIL intro course of students.

College is not vocational training (unless you're a law or medicine student), it's for learning how to think.


If you study the International Baccalaureate (IB), a high school curriculum taught around the world that is based on the French system, you are required to study Theory of Knowledge, effectively an introduction to philosophy.


> The humanities are glossed over at best in public high schools across the United States. I don't know of one that requires a PHIL intro course of students.

I'm not sure how much weight this argument holds. The whole "gen ed" thing is a rather US-centric concept.

I don't know of any universities in the UK that require a PHIL intro course of students. When you go to university, you overwhelmingly study the one course ("major") that you picked beforehand. There's often a small amount of room on many courses for optionals from other fields, if you want to take them, but this is by no means mandatory and I'd say the proportion of folks doing philosophy modules studying a different degree at my alma mater was slim.


No wonder the UK's app startups are even more ridiculous than the US's app startups ;).

Yes, gen-ed is ubiquitous in the US. If you're in any humanities related program in the state I live in (Texas, so that's probably 25-30 large universities total), you'll have to take an intro PHIL course at least, which will probably be Plato and a random survey of 19th century European readings.

More is highly recommended for students looking for law school admission after their undergraduate degree at the state-owned college I attended.


In theory, it would be good to have a part of our educational system which teaches people how to think. But what does that look like? I'd say that boils down to two things: logic and evidence.

The vast majority of philosophy classes are absolutely garbage at teaching either of those things. Sure, in theory, logic is part of philosophy, but in any of the philosophy classes I've taken, we didn't talk about logic. The things we did talk about were often examples of how not to think, yet they were presented as equally valid next to much more rational ideas.

For example, one of the things we covered in my ethics class was Kant's categorical imperative. The categorical imperative falls over instantly if you apply even the most basic logic to it, but no mention of this was made in any of the course discussion or materials. I'm sure lots of people walked out of that class thinking that the categorical imperative was a perfectly reasonable way to make ethical decisions. If this is the sort of "learning how to think" philosophy classes are doing, then I'd prefer we didn't--I'd rather let people figure out how to think on their own than to teach them unequivocally incorrect ways of thinking. Philosophy could be useful if these classes were taught as, "Here's a bunch of historical ideas, and here's how we apply logic to prove them wrong." But until that happens, I'd strongly oppose introducing any more philosophy to curricula.

Other fields are better-equipped to teach people logic and evidence. Science is all about evidence collection, and logically applying the collected evidence to the evaluation of hypotheses. Math, especially around proofs and derivations, is all about logic, and probability and statistics give you tools that are very broadly applicable. History, if taught well, teaches you how to logically analyze artifactual evidence and logically contextualize the present in terms of the past.

But, there are two problems: first, many college students don't focus much on these areas. And second, the parts of these fields which I mentioned aren't particularly well taught even by these fields. Many students get A's in science classes thinking that science is memorizing a bunch of facts about chemicals or living things, without ever having learned how to obtain new facts themselves. Many students get A's in math classes having memorized a bunch of formulas without being able to derive even basic proofs. Many students get A's in history classes having memorized a bunch of historical events, without knowing the difference between primary and secondary sources, and without ever considering that an author might have bias. Even the classes which do teach people how to think, to some extent, generally do a piss-poor job of it.

That's not to say that these fields (and other fields not mentioned) have no value. Even if you think well, your thinking is only as useful as the evidence you feed into it, and colleges do a very good job at moving vast amounts of evidence on a variety of subjects into people's brains. Further, colleges often do a lot of work getting people skills: lab techniques, using computers, effective communication, etc. You can argue that the purpose of college is learning how to think, but the implementation of college is much better at teaching people information and skills. Learning how to think would certainly be valuable, but de facto it's not what colleges are doing, and the things colleges are doing do have some value.

That said, modern colleges often put teaching of any kind behind profits, and that's not something I see any value in for society or students.


I agree completely on your critique of profit motive at universities, and think it particularly applies to state-owned institutions. There is a false notion that profit is an automatic good, when that is clearly not the case.

There is more to critical thinking than formal logic, I'd argue. The classroom format of a typical humanities college course has a lot to do with this. For example I would argue that a person without any background in the classics and some basic theology is going to struggle to get much from Paradise Lost. It's dense, difficult, and you have to know some things going into it to pick up on the nuances. But 30 people discussing it collaboratively 2 or 3 times per week, with the expert professor's guidance when they stumble on certain parts makes for a lot of interesting discussion. Thirty different people will pick up on thirty different bits and pieces in every classroom session.

I'd guess that a big part of the reason there is such a glut of humanities graduates who can't find professorships is that people simply enjoy the classes enough to keep going all the way through graduate degrees. You get discussion and debate in those classes that you can't find anywhere else.

I don't think the above is true of many other disciplines of study, with so many degrees offered being pitched for purely profit motive as job training, as you mentioned above.

I can't do a better job of describing this than this professor, who puts his public lectures on YouTube for free:

https://www.youtube.com/playlist?list=PLpO_X3dhaqULiItXg84O9...


> There is more to critical thinking than formal logic, I'd argue. The classroom format of a typical humanities college course has a lot to do with this. For example I would argue that a person without any background in the classics and some basic theology is going to struggle to get much from Paradise Lost. It's dense, difficult, and you have to know some things going into it to pick up on the nuances. But 30 people discussing it collaboratively 2 or 3 times per week, with the expert professor's guidance when they stumble on certain parts makes for a lot of interesting discussion. Thirty different people will pick up on thirty different bits and pieces in every classroom session.

Well, if you look at literary criticism, there are a bunch of different ways to do it. The oldest ways, such as authorial intent or historical criticism, aren't that divorced from history as described in my previous post, or from just normal old formal logic. But a lot of the ways popular now, such as Marxist criticism or feminist criticism, are forms of reader-response criticism. In the worst cases, this sort of criticism can be used as a pulpit for professors to pass on their ideologies, which is deeply problematic--rather than teaching students how to think for themselves, it's teaching them to think like the instructor. In the best case, it can teach students how to evaluate literature in relation to their own goals--but I would argue that this is just an application of formal logic. The reality, in my limited experience, is neither of these extremes--classes I've taken and my friends have taken have mostly been "these are some of the ways people have thought about literature"--it's more about passing on information than about teaching how to think.

As I've said before, there's a lot of value in giving people information, I just don't think it supports the "college is about teaching people how to think" narrative.

That said, I'll give two caveats here:

1. My own formal training isn't in literary criticism, and beyond some general-ed requirements and second-hand experience from friends/partners in literature programs, I have very little experience here. My impressions here may very well be off-base, which is why I didn't mention literary programs in my previous post. A notable bias in my previous post is that I talked most about the fields I'm most familiar with.

2. Up to this point, I've basically been speaking about teaching facts versus teaching how to think as if they were two different, mutually exclusive things, but it's worth noting that that's not quite true. It's true that simply giving a student a fact doesn't teach them how to evaluate whether something is a fact, but if you give a student enough facts, eventually they come across discrepancies and begin to experience cognitive dissonance. Over vast swaths of facts resulting in a few discrepancies, a student will eventually begin to come up with their own framework for evaluating discrepancies, and hopefully that framework will look a lot like formal logic and evidence collection. I'd argue that this is a very, very inefficient way to teach students how to think, but eventually I think it does work.


> For example, one of the things we covered in my ethics class was Kant's categorical imperative. The categorical imperative falls over instantly if you apply even the most basic logic to it, but no mention of this was made in any of the course discussion or materials

I've read this a couple of times and I'm curious about what you're saying here: do you mean that your class just reviewed some writing on the categorical imperative on its own, or read Groundwork of the Metaphysics of Morals?


I'm not sure what you mean by 'teach the basics'. Certainly, college can provide the texts and an environment where other people are interested in the same subjects. If that's all you mean I agree, though it's far from the only institution where that's possible.

The trouble, I think, is that making ethical judgements requires wrestling with ethical conundra oneself, and that is not something a professor with 300 undergrads to teach can provide any useful guidance to the majority of them for. The idea of accurately assessing performance is even more unrealistic. Maybe it's a function of the kind of university I attended, but the vast majority of my fellow students who were taking these 'subjective' courses were simply gaming a rubric in their writing. And this is true even of those who were genuinely interested in the subject matter, they saw it as a price of admission.

Which seems to me like an impediment to actually learning what was traditionally taught on more of an apprenticeship than an industrial model. If your own undergraduate experience was different, I'd be curious what your university did differently.


> making ethical judgements requires wrestling with ethical conundra oneself, and that is not something a professor with 300 undergrads to teach can provide any useful guidance to the majority of them for.

I don't remember a lot from my undergraduate course on Ethics, but I do recall that literally in the first lecture, the professor presented us with questions about things like how should one behave or treat others, and then presented us with "edge cases" that directly challenged what most of us had answered.

As a young person, it's very easy to think that our problems are novel and unique, and the ethics course very clearly showed that many of these problems are millennia old, with people having given names to better-realized versions of what most of us think of as the way we should behave, and that people have spent lifetimes of work writing and arguing about the ramifications and "edge cases" of such philosophy.

I feel like the biggest benefit from the course was not any particular ethical guidance, but rather the challenging of our beliefs, and the realization that these things _are_ hard, and are not something we can trivially answer with something that fits on a Hallmark card.


Do you really think new college students absorb all that they hear in the entry-level classes? You're also operating under the assumption that all professors employed by a college are capable of effectively communicating the topic they're supposed to teach.


>I can't word this in a non-snarky way, but it's a genuine question:

>Why do you believe that college can teach making subjective decisions?

Lol, most people actually believe that common sense can be taught. I don't.


I agree. Developing it is one thing, even building a space where it's easier to develop, but teaching it? As a guy who's taught for both fun and profit, I don't buy it.


Anti-college is not anti-intellectualism. Quite the opposite. Higher education is no longer about education; it's about profit. Pieces of paper are pointless for anything other than wallpaper. Free-range education through meeting and collaborating with others is more beneficial to expanding your knowledge than handing over money to some college. Save those tens of thousands of dollars and years of your life. Spend that time and money being an apprentice, creating your own curriculum, or taking specific training.


> Anti-college is not anti-intellectualism. Quite the opposite. Higher education is no longer about education; it's about profit.

In the US, maybe. Do people take ridiculous loans for their degree outside of US? Some loans, sure, but loans that amount to 5-10x their future yearly income? I don't know...


While there are some truly staggering examples of US college loan debt, the average loan debt at the end of a 4-year degree in the US is $26k, or about the price of a new mid-range car. For the majority of people, total college loan debt is below their first-year annual income out of college, and certainly not 5-10x.

[1] http://www.collegescholarships.org/loans/average-debt.htm


> In the US, maybe. Do people take ridiculous loans for their degree outside of US? Some loans, sure, but loans that amount to 5-10x their future yearly income? I don't know...

Yes. In the UK. I had a relationship with someone who specifically learned German in school and went to Germany to tutor in a cross-education outreach program after she graduated, to scout for universities she wanted to attend in order to avoid having to take out massive loans like her siblings did back home. Very smart girl.

She and I enrolled in online classes; I had already completed my BSc but wanted to do this with her. But she felt she was missing out on the "campus life" part of the university experience, so she went into pedagogy to the master's level and now teaches back in the UK.

In a post Brexit World, that is just not possible.

The EU is still pretty favourable in terms of university costs being hidden and obfuscated via VAT for the students, but many industries within its local economies (PIIGS, Romania, Hungary, Slovenia in the Eurozone, and just about most of the periphery member nations) cannot provide adequate jobs, let alone careers, to their graduates within those sectors, so they have to go to Germany, the UK, Holland and, as things have gotten worse, France, though to a much lesser degree than when I was there.

The ideal is landing a job in the US or China, where they can make obscene amounts of money in certain fields like tech or medicine with little to no debt and subsidized advanced degrees. That still means the work visa lottery, and uprooting your life during some of the most critical years of your entire life (late 20s to early 30s) in the hopes it pans out.

The best thing that could happen is to disrupt it entirely, level the playing field, and restructure it in such a way that it's both affordable and accessible to everyone motivated to go in and meet its requirements. And incentivize them to stay in their home towns, build a solid community, and tie it to the needs of the actual local labor force: hopefully doing away with the notion of studying civil engineering for oil rig drilling if you're from Iceland, that kind of thing. It makes no sense, and doesn't reflect the value system or the job prospects of your community, let alone the job prospects of a nation that is entirely dependent on renewable geothermal.

How exactly the Lab portion of STEM gets solved is still a mystery.

I propose building auxiliary wet labs in libraries within their communities. The net benefit being that students would be required to teach children and adults in their community the topic or subject they are studying, as a graded portion of their coursework, for the privilege of having such a model, and build community in the process. Or perhaps that should be the only real on-campus component (at both universities and community colleges) of what is an otherwise entirely online system?

Just look at this example. Having had to attend my midterm and practicals during one of the largest fires in San Diego history (I was literally trapped in my car on my way back home to OC for 7 hours after they closed campus while we were sitting down for the exam as the classroom filled with smoke), and my finals during the H1N1 swine flu pandemic, I can understand this from both sides:

https://ktla.com/news/local-news/ucla-professor-suspended-af...

That hot-button issue could be entirely mitigated; whether you're for or against the BLM protests is irrelevant. As a purely practical and logistical matter, you could overcome this with the technology we already have, and avoid the certain backlash against the professor and department from the irate student body and opportunistic media.

I saw a rant from a UCLA professor pretty much laying out how he and his entire profession have not seen a single decrease in pay since he left university in the late '80s as a TA, and how he watched the CSU/UC extortion system being assembled in what was once the envy of the entire US university system: one which followed the EU's model pretty well, was low- to no-cost if you were local, and had the ability to employ its graduates as the California economy could support it. That was a net benefit that significantly contributed to CA becoming the 8th largest economy in the world.

I can't seem to find it and really wish I had saved it, as the very employees in the system are at the point where they know it went too far. And they are perhaps even afraid of what an angry mob can do these days.


I agree with you, but the primary issue is signalling via degrees. For the elite non-college educated who already have a working portfolio of projects to reference, landing a "white-collar" job may be a possibility. For the rest of them, a non-degree holder, even if objectively competitive with a degree-holder, will be immediately discounted by a hiring manager who's looking at 250 resumes.


Productive disagreement? I think to begin, we need to vet the people we might have disagreements with. I don't think productive dialogue can be commoditized. Maybe it can after the fact via podcasts etc., but dialogues themselves are, I think, inherently analogue and highly personal. I think, given the realities of the attention economy, that we need to be much more selective about whom we let into a conversation that might change our own mind. And this needs to be on a case-by-case basis, with everyone setting and updating their own standards on whom they'll let into dialogue.

I think people like the one you're responding to would agree, and increasingly think that association with institutions of higher learning sends a strong signal to avoid dialogue. It doesn't necessarily look like anti-intellectualism to me, any more than filtering out people who didn't graduate high school is necessarily elitism. I could see myself rationalizing either, depending on the kind of conversation I wanted to have.


Could you please define anti-intellectualism by your standards? To me, anti-intellectual notions would only apply if the initial intentions of a discussion were based on a very rich topic and were dissolved in some manner. I beg a brief and graceful peer review for the poor submissions that likely seek solace in the isolation of a virus-infected planet.

And in the end, my intention in posting was solely altruistic: informing whoever was reading that if they were to read this article and consider getting a degree from U of the P, they should consider the risk. Just a gesture. However, I think my writing style might have been misunderstood as some semblance of a pseudo-intellectual attempt or such. Do know, for the record, that as the 1st to reply to the post, my intention was to inform.

But I am intrigued and inspired. How about we both try to post an article that invites our versions of intellectualism? Ready, set, go.


>"How do we have productive disagreements going forward?"

We are barely having any of those right now in the greater society. As long as we can't argue facts and objective reality, and do so without feelings getting in the way, we'll continue descending into anarchy and divisiveness.


US is "the greater society" now??


I'm not from or residing in the US. My comment was aimed as commentary on what I can see happening globally.


Your post lacks nuance.

Some news is fake, some isn't.

Some science is fake, some isn't.

Schools are barriers, but for many elements of a school, the fact that it's a barrier is a good thing--we don't want ignorant people performing in roles where knowledge is required. The problem is that many elements of schools are barriers which are poor at achieving their purpose, or are directly counterproductive to their purpose.

> How do we have productive disagreements going forward?

That's a complicated question, but oversimplifying the opinions of people we disagree with and then labeling it ("cynical anti-intellectualism") isn't the answer.


Or: Objective reality might exist, but it's not necessarily present in the universities, which are mostly there as an IQ test by proxy with a filter for the most lazy, with some social indoctrination thrown in. Any science or truth exists only at the whims of the social order of the day.


>Any science or truth exists only at the whims of the social order of the day.

This is not objectively true but I understand what you're trying to say. I'm sad to hear that your experience of science and truth has been only that which society has given you, or at the least that you feel that others are only experiencing it in that way


Sorry, I should have worded that more clearly: only science or truth that doesn't contravene the current social order is allowed air, allowed to be talked about without reprisal and shunning. The degree to which this is true is a canary for how totalitarian and neurotic your micro-society is. The micro-society of universities generally seems to be becoming more, rather than less, rabid and paranoid.

A lot of it is like Bostrom's idea of the decentralized electroshock dystopia: even though a significant proportion of people are witches, everyone's afraid of reprisal for not actively hunting witches, so the witches-in-hiding hunt their own when they're unmasked.

But this is the way of things; this wave will pass, eventually, as well. And like the Soviet scientists who kept their heads down and mouthed the party line, the secret iconoclasts will survive till the current order is replaced by the next, with its own peculiar taboos.


This is a crucial argument. Do you have a link to something that supports your statement that “this is not objectively true”? I think it would help a lot in these debates.


Yeah it's called "The Barstool Experiment". Look it up :-)

The short summary is: Two people sitting in a bar, discussing how there is no objective truth, or that "science or truth exists only at the whims of the social order" and stuff like that, all day long. And then a third person comes along and smashes their brains with a barstool. In some variations, the first two people are a scientist and a priest. But the third person is just some dude with a barstool.


Is there an open directory of radio stream URIs?

Like if I wanted to build my own tiny radio streaming client, where would I look for the streams?

I tried snooping the network tab in dev tools on Radio Garden, but couldn't see many requests that weren't either grabbing map tiles, or connecting to Radio Garden itself.


There is http://dir.xiph.org for Icecast radio stations.
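
If you want to script against it, the directory also publishes a machine-readable listing. A minimal sketch in Rust, assuming the classic yp.xml endpoint is still served and that reqwest (with the "blocking" feature) is a dependency:

    // Sketch: list Icecast stream URLs from the Xiph directory.
    // Assumes http://dir.xiph.org/yp.xml is still available and that
    // Cargo.toml has: reqwest = { version = "0.11", features = ["blocking"] }
    fn main() -> Result<(), Box<dyn std::error::Error>> {
        let body = reqwest::blocking::get("http://dir.xiph.org/yp.xml")?.text()?;
        // Crude extraction of <listen_url> tags; a real client would
        // use an XML parser instead of string matching.
        for line in body.lines() {
            let line = line.trim();
            if let Some(url) = line
                .strip_prefix("<listen_url>")
                .and_then(|s| s.strip_suffix("</listen_url>"))
            {
                println!("{}", url);
            }
        }
        Ok(())
    }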


Thank you!



Thanks!

