TinyML and Efficient Deep Learning Computing (efficientml.ai)
239 points by samuel246 12 months ago | 37 comments



I highly recommend the tinyML Talks to anyone interested in pushing edge compute past the last mile. They have a huge library and a full upcoming schedule, and most of the slides are available, so there is a huge amount of content. It's amazing what you can get running on embedded systems.

https://quip.com/MENbAvuQkrb0


I'm curious if anyone knows people who could help me bring some efficient ML project work I've done to places like Africa. I've visited a couple of times and loved being there so much, but I definitely feel extremely disconnected from that part of the world. It would mean a lot to me to make some of those connections. <3 :'))))


Hi Tysam_

Thanks for your interest in helping others with your knowledge. I would highly recommend Deep Learning Indaba https://deeplearningindaba.com/mentorship/ ; it's one of the biggest ML communities in Africa and has a host of students, companies, and mentors.


Thank you, very much appreciated, this means a ton. Thank you for helping me make the connection, I will look into this. <3 :'))))


Could you write to a university near where you’d like to work and ask them for some advice too?


I think you may have replied to the wrong comment (I might be misunderstanding, however).


I think the suggestion is to try to collaborate with local universities. From experience, a lot of research/fieldwork in Africa is only practical with local experts and communities.

Also look into attending or supporting the next Deep Learning Indaba. This year's has just finished, I think. It's one of the largest African ML communities and they have a mentorship program.

https://deeplearningindaba.com/2023/


$25 billion endowment, and we get the audio quality of a 1930s-era wire recorder. Annoying.


OOC, what are you listening to? The YouTube playlist has two videos for each lecture: a Zoom recording focusing on the slides with the professor visible on camera, and a classroom recording. The latter definitely has better audio quality, but the audio of both is far better than your average OCW lecture from 10 years ago.


It's ironic that the audio quality of most AI online courses is terrible. I'm looking forward to the moment when AI can be used to improve the audio of these lectures, especially older ones like the Feynman lectures. [1]

[1] https://www.youtube.com/watch?v=-kFOXP026eE&list=PLS3_1JNX8d...


It did occur to me that improving the audio might be a good final exam. Better suited to a DSP class than an ML class, though.


With subtitles it's bearable.


I feel like this should be at the forefront of thinking on ML. Seeing how much computing capacity the big cloud companies are planning to build out to meet demand for ML infrastructure is pretty concerning from an energy use standpoint.


If we care about energy use so much, why is the Las Vegas gambling industry allowed to exist? Or, for that matter, Shein? It's like we are doing premature optimization of one aspect of the economy (tech), while other industries are given free rein.

https://time.com/6247732/shein-climate-change-labor-fashion/


I agree and we should be concerned about those industries too.


Work from home could be planet saving (saving so much wasted fuel and time), but top management needs to feel important. Why aren't journalists raising these talking points?


This is a false dichotomy.

Bleeding edge research work should not be hindered by premature optimization concerns. Take quantization, for example. Before people were able to train a model at the usual floating point precisions, nobody knew that INT8 or q4 quantization would be feasible. (In fact, nobody would have had a full-precision model to compare performance against.)
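
For the curious, a minimal sketch of the INT8 idea using PyTorch's post-training dynamic quantization (the tiny model and sizes here are just placeholders for illustration):

  import torch
  import torch.nn as nn

  # Stand-in for a model trained at the usual floating point precision.
  model_fp32 = nn.Sequential(nn.Linear(128, 256), nn.ReLU(), nn.Linear(256, 10))

  # Post-training dynamic quantization: Linear weights are stored as INT8,
  # activations are quantized on the fly at inference time.
  model_int8 = torch.ao.quantization.quantize_dynamic(
      model_fp32, {nn.Linear}, dtype=torch.qint8)

  x = torch.randn(1, 128)
  # Outputs should stay close while weight memory drops by roughly 4x.
  print((model_fp32(x) - model_int8(x)).abs().max())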

Also, the idea that energy efficiency should be a top concern basically undermines the whole idea of developing new technology. It's obvious that if fancy things are to come out of the research, they are going to cost more energy to run than not running anything at all. That in itself is an argument that, if we don't want energy usage to keep ramping up, we should shut down ALL research that could potentially give us new energy-depleting toys.

So, really, I'm personally not concerned with "one-off" resource usage if it advances human understanding of the state of the art. Since energy actually costs money, capitalist pressures will make people think of ways to save energy (and time). The moralistic arguments are just misguided in the big picture. IMHO it feels like luddites putting on the environmentalist hat here.

Instead of shaming machine learning researchers over their energy use, it's probably more effective from an energy use standpoint (for example) to ban "proof of work" schemes in cryptocurrency.


Instead of shaming machine learning researchers over their energy use, it's probably more effective from an energy use standpoint (for example) to ban "dirty" energy plants.

Banning a class of high energy consuming algorithms used towards a non-violent technological development you disagree with makes absolutely no sense to me.


> Bleeding edge research work should not be hindered by premature optimization concerns.

Isn't the ingenuity in doing things efficiently rather than just turning off the "scarce resources" setting? Really, I wish someone would explain to me what the substance of "bleeding edge research" is when nothing has been optimized except the flow of grant dollars from DoE/NSF to NVIDIA. Like, what is the research in "we burned O(1,000,000) compute core hours to train a model"? I was recently in a discussion with ANL people about Rick Stevens' next big idea, the "Trillion Parameter Consortium", where he plans to just commandeer Aurora for a month. lol.


Couldn’t cryptocurrency proponents make a similar argument?


OK, I will: it's illiberal to ban a computation on the grounds of the pollution emitted by some ways to generate energy. If the pollution is bad, you want to discourage the bad way to generate it, and encourage the good way. If this makes a computation more expensive (in those places where less-pollutey power is less available), that will discourage that computation if and only if it's not valuable enough to the person choosing whether to run it.

A human may emit more CO2 than a computer doing the same work. As AI improves, this will happen more and more, until computers dominate humans by this measure. Then you'll have established a principle justifying a ban on biological humans.

If you believe in human rights, it's mostly bad to put prior restraint on people's use of computers, just as it's bad to restrict writing, speech, and privacy. The restriction of this kind that makes the most sense to me is on the development of superhuman general intelligence, until we understand better how to avoid it ending humanity.


No. How does building a lot of crypto mining infrastructure now reduce the cost of crypto mining 5 years down the line?

I will say though that at this point in time it's ridiculous how little research has gone into reducing the energy use of networks.


Yes they could. Does that make it harder to reason about?


> Also, the idea that energy efficiency should be a top concern basically undermines the whole idea of developing new technology.

I'm sorry, but this is such a dumb argument. Of course new research will burn energy; there's no question about that. To suggest that ML is like any other development, when it's well documented how much power these algorithms (LLMs, for example) use, is kind of bad faith.

The argument is not that we should halt technological progress, but that in doing so we should always be considering its impact on the planet and evaluating whether it is really worth it.


> Bleeding edge research work should not be hindered by premature optimization concerns

Except nothing here is hindering the ability of scientists to develop whatever new technologies they want in their labs.

  > shut down ALL research
  > shaming machine learning researchers
You're being sensationalistic. They're not banning supercomputers. AI research isn't facing an existential threat.


  > shut down ALL research
  > shaming machine learning researchers
It’s quite hard to make any suggestion to a researcher these days, especially AI researchers who think they know it all.

Even calling research work "research" is often not well received :)

This is one of the reasons I'd like this bubble to burst soon, so that we can focus on applying the resources we have efficiently, rather than pursuing "bleeding-edge" research that is often pointless.


> Since energy actually costs money, capitalist pressures will make people think of ways to save energy (and time).

This and similar facile arguments are getting tiresome. They seem predicated on nothing but the most basic Econ 101 understanding of value. Nothing exists in that idealized world -- in reality, complex mechanisms keep this kind of excess spend relevant (marketing and public image, "first to market" concerns, sunk cost spending, and a million more) regardless of more material considerations, so to lean on this trope is not really up to the standards of HN discussion in my book.


Do you hold the same opinion for crypto?


As you know, energy efficiency is not only about research and training, but increasingly about inference and productionizing the models. This is mostly what this course emphasizes too.

It's not about luddites putting on the environmentalist hat; if the environment were the main concern, your capitalist pressure couldn't care less (\o/ externalities). It's about not draining phone batteries and not racking up datacenter bills.


Video encoding uses a *lot* of energy: YouTube, Netflix, and friends.

And capitalistic forces are not fast enough and do not take environmental impact into account, as carbon taxes are either not there, not high enough, lobbied against, or not measurable and gamed.


CRT TVs used a lot of energy. It’s getting better.


DeepSpeed and Hugging Face Accelerate already include a lot of these features, and they're indeed already used in production. This class would be a great intro to said features, given most people barely have access to one GPU, let alone hundreds.
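
For anyone who hasn't tried it, a minimal sketch of the Accelerate workflow (the toy model and random data are just placeholders):

  import torch
  from accelerate import Accelerator

  # Accelerator picks up whatever hardware/config is available (CPU, one GPU,
  # multi-GPU, DeepSpeed, ...) from the environment; options such as
  # mixed_precision="fp16" can also be passed here.
  accelerator = Accelerator()

  model = torch.nn.Linear(128, 10)
  optimizer = torch.optim.AdamW(model.parameters(), lr=1e-3)
  model, optimizer = accelerator.prepare(model, optimizer)

  for _ in range(4):
      x = torch.randn(32, 128, device=accelerator.device)
      y = torch.randint(0, 10, (32,), device=accelerator.device)
      loss = torch.nn.functional.cross_entropy(model(x), y)
      optimizer.zero_grad()
      accelerator.backward(loss)  # handles grad scaling / distributed sync
      optimizer.step()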

Moreover, the class also covers topics related to using and hacking with foundation models if all you have is the model, versus a cluster.

Energy used to train foundation models is indeed extreme, but the gross expenditure is more a function of corporate spending and competition than of deep learning technology.


> is pretty concerning from an energy use standpoint

I would agree that efficient LMs should be a focus, but a framing like that is too pretentious and misses the point IMO. I'd expect Apple to heavily double down on this point though, because that's what they always do (see the latest Apple event for more).


Initiatives like green data centers, which are designed to minimize environmental impact, already exist.



It seems that the course doesn't include Google's highly efficient "step-by-step distillation" method, which was discussed on HN yesterday.


They don't have this semester's problems online, do they? Just last semester's? Wondering if I missed it.



