Get started making music (ableton.com)
2106 points by bbgm on May 9, 2017 | 461 comments



Anecdotal: there are a few different approaches to learning songwriting that seem to click for beginners. The "build up" approach is the most common and is what this link offers: it first teaches beats, then chords, then melodies and then, in theory, vocals etc. These lessons in this order make sense to many people, but not everyone.

If you're interested in learning to make music and the lessons in the link are confusing, overwhelming, or boring, some students find a "peel back" approach to learning songwriting easier to grasp at first. A peel back approach just involves finding a song and teaching by stripping away each layer: start by stripping away the vocals, then learn the melodies, then the chords, then finally the drum beat underneath it all. A benefit of the peel back approach is that melodies and vocals are the most memorable parts of a song and the easiest to pick out when listening to the radio, so a student can learn using songs they know and like. Either way, songwriting is hard and fun. Best of luck.

P.S. I think Ableton makes good software and I use it along with FL and Logic. They did a solid job with these intro lessons. But it's worth mentioning that there is free software out there (including Apple's Garageband) that offers key features a beginner just learning songwriting can practice and mess around with before purchasing a more powerful DAW like Ableton.


> there is free software out there

If anyone is interested in a Free/Libre/Open Source Software option (cross-platform Linux/Windows/Mac) I've really enjoyed producing with LMMS over the past 18 months or so: https://lmms.io/

It's definitely got room to grow in terms of functionality/interface but the development community is of such a size that it's possible to still make meaningful code contributions. I've contributed a couple of small patches to improve the Mac UI as a way to get familiar with the code base.

Of course, the downside is that I have to decide whether to write code or make music whenever I sit down to use it. :)


There's also a new project called Helio Workstation (https://github.com/peterrudenko/helio-workstation). It doesn't have many built-in instruments, so you need plugins for everything, but the UI looks awesome.


I like LMMS but it has enough vst issues that I just went with a cheap Reaper license. Plus I need an audio recorder as well.


I wouldn't say that "LMMS has VST issues". I would say that VST has a serious issue: it is not an open standard. Although it seems they are trying to improve that for the Linux community: http://cdm.link/2017/03/steinberg-brings-vst-linux-good-thin...


Good to know. That said, Reaper enabled me to use the 7-8 VSTs I was really interested in using that didn't work (or worked poorly) in LMMS. If I knew C++ better I would contribute.


For recording, there's Ardour (among several other FLO options)


The Song Exploder podcast is awesome for both approaches, and I would really recommend anyone interested in writing and/or producing music to give it a go.

"A podcast where musicians take apart their songs, and piece by piece, tell the story of how they were made." @ http://songexploder.net/


Seconding this recommendation. The more electronic- or instrumental-oriented songwriters on the show tend to get more into the nitty-gritty details of production, layering sounds, etc. (the first two episodes with The Postal Service and The Album Leaf, for instance) while rock and singer-songwriter types tend more towards the songwriting aspect (the Long Winters is a personal favorite).

I sort of wish there were more technical details as a rule, but it's understandable given the relatively short format that they can only cover so much ground. I'd prefer longer episodes personally, but I suppose not everyone would, and there are tradeoffs in producing more content. I guess I'm just glad that the show caught on and is still going strong.


+2. Only podcast I listen to. The first few episodes are a little rough around the edges, bad questions, people not knowing exactly what they were getting into, but the rest are all incredible.

Protip: sample the clips the guests put on the show :) I've gotten some really great material from this show sonically, since most of the clips seem to be the individual instrument tracks.


+1 for SE. Also there is an amazing set of Motown tracks split into instrumentals and acapellas out there. I'm not sure if it can be legally obtained easily, but it's a great master class.


Is this a common distinction to acknowledge in general education environments? You pretty succinctly described the struggles I've tended to have in my education, and described it in a refreshing/revealing (for me) way.

I love looking at systems and peeling back the layers to find out what makes something tick. That's not an approach to learning that I really encountered until I entered the workforce and was met with complex systems that I needed to understand. And I loved it!


Interesting, I've never heard of the "peel back" approach, and I can totally understand why it would be instantly satisfying for a beginner in music to get started that way. Do you have any articles or books on the subject matter?

How would this approach apply to a more traditional instrument that doesn't have the advantages of having a "good" sounding sample already preloaded that can be easily layered into a song that you are composing? I grew up learning the violin and it was endless disjointed drills until it was put together in a classical song that I never heard before nor had the desire to play. 8 year old me just wanted to play the theme song to "Jurassic Park" and roar like a T-Rex.


I think there's a difference between learning composition, and learning to play an instrument.

In my view, learning an instrument has a lot in common with learning to code, in that some people take to it, and others don't. And we probably know some of the reasons, but not all of them. Of course teachers and teaching programs vary, as do kids and their family milieu. But nonetheless, music education has huge attrition.

For instance, by way of anecdata, I took string lessons as a kid and loved it, and my kids have gotten pretty serious on violin and cello. They actually like classical music, and it probably helped that both of their parents also enjoy it. So it definitely works for some people.


What you said about learning melodies and beats and chords kind of confused me. Do people actually learn how to make up music? I always thought it was just some natural ability that people have. For as long as I can remember, if somebody told me to write a song I would just spit it out after a while. Am I unique in this respect?


I created an account just to reply to your comment. As someone who has played keyboard instruments all my life, it hadn't crossed my mind until lately that the idea that there is structure to music is not well known.

Just for fun: chords in scales are numbered from bottom to top in Roman numerals. I feels like home base, V feels like wanting to go home. If you want to create the feeling of going home but then not really go there you can go from V to VI instead of I. 'Sad but I have closure'-type ending? Major IV - Minor IV - I. Bluesy feeling? Add a minor seventh to your I, IV and V chords. Dreamy? Major seventh instead there, except on the V.

It's even entirely possible to learn to recognize all of these types of chord progressions and sounds instantly. I'm working on and off on an ear training app that randomly generates them that musicians can use to train their musical ear.
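
To make the Roman-numeral idea concrete, here's a rough Python sketch (just an illustration, not the ear-training app mentioned above) that builds the diatonic triads of a major key and generates a random progression that starts on I and resolves V to I or V to vi. The key, note spellings, and cadence choices are assumptions for the example:

    import random

    MAJOR_SCALE = [0, 2, 4, 5, 7, 9, 11]          # semitone offsets of a major scale
    NOTE_NAMES = ["C", "C#", "D", "D#", "E", "F", "F#", "G", "G#", "A", "A#", "B"]
    ROMAN = ["I", "ii", "iii", "IV", "V", "vi", "vii°"]

    def triad(degree, key_root=0):
        """Stack scale degrees 1-3-5 above the given degree (0-based) of a major key."""
        pitches = [MAJOR_SCALE[(degree + step) % 7] + 12 * ((degree + step) // 7)
                   for step in (0, 2, 4)]
        return [NOTE_NAMES[(key_root + p) % 12] for p in pitches]

    def random_progression(length=4):
        """Start on I, wander, then end on V -> I (home) or V -> vi (deceptive)."""
        middle = [random.choice(range(7)) for _ in range(length - 3)]
        return [0] + middle + [4, random.choice([0, 5])]   # 0 = I, 4 = V, 5 = vi

    for degree in random_progression():
        print(f"{ROMAN[degree]:>5}: {triad(degree)}")

Spelling the chords out this way also makes it obvious why V to vi lands as "almost home but not quite": the two chords share two of their three notes with I.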


>I'm working on and off on an ear training app that randomly generates them that musicians can use to train their musical ear.

Sounds interesting. Please do a "Show HN" post when your app is ready for it.


Will do, thanks!


As someone who's recently tried to get into (basic) music theory, trying out the chord progressions you mentioned was fun! Do you happen to know of any resources that go through more of these well-known chord progressions?

I'm also wondering if these chord progressions work the same way for all scales, or if, for example, the 'sad but I have closure'-type ending only sounds that way in major scales? From experimenting I think it only works for major scales, but I'm not sure :)



You probably have learned most/all of your musical knowledge implicitly.

Some people have a great ear for music and can write solid songs without formal training in music. Other folks come at music from the more theoretical side, although usually with a lot of implicit knowledge of and experience with music as well.

For most people who are not formally trained in music, their songs can be improved upon on a technical level by someone who has deeper theoretical knowledge (learned either explicitly or implicitly).

For a good discussion of this, check out Tim Ferriss's podcast interview with Derek Sivers. Derek talked about how he had learned a lot about music implicitly. In one summer, a teacher of his formalized that knowledge so efficiently that he was able to test out of a lot of classes (1.5 years' worth?) once he went to Berklee College of Music.


Songwriting can be taught, yes. In most music courses you start by analyzing the Bach chorales, which (along with some Gregorian work from the Middle Ages) are what really kicked off contemporary music composition. By analyzing the chorales and moving forward from there you learn how to manipulate chord progressions, harmony, and counterpoint.

Composers classically trained this way tend (!) to have an easier time writing melodies, harmonies, and progressions in a consistent manner, ie not having to wait for "inspiration to strike". The composer, of course, still needs to develop an emotional connection in the music, but the point is that it can, and routinely is, taught.


My girlfriend is a trained classical singer whereas I'm a self taught musician. She doesn't really gravitate towards the rock music I like to write but because of her training she can easily jam, riff, or write anything far quicker than I can. Songwriting is a very technical skill indeed.


What I find difficult is that by the time I've got my DAW going and found some synths I like, the tune in my head has evaporated. Do all people find musical thoughts so insubstantial, or is it just me? If I imagine a picture or a paragraph of text, it'll stick around and I can remember it more or less indefinitely. I still recall snatches of crap poetry I thought up when I was a teenager, but any music I imagine just disappears before I can get it down.

The most successful tunes I made were more or less "discovered" from incrementally experimenting in the DAW, and not from any kind of original plan or idea. Maybe I'm just not a musician! (I'm an indie game dev who started making my own tunes for my games)


You could consciously decide NOT to use synths to lay down the bones; always use a piano to begin with. Once you have the tune idea down, you can move on to orchestration and picking synths and so on. Always keep the piano track as a guide and start adding tracks for all the other components until you have what you need.

From a remembering-the-tune perspective, I have the same issues, but I think it's more related to not applying musical lexicon and hearing skills the same way. You remember poetry or a paragraph of text because you remember the ideas and how to go from one to the other. If you are a musician and have something in your head and start thinking along the lines of "this is using a Lydian mode, the progression is ii IV V I, then it modulates to the relative minor and switches to Dorian; also, the theme goes down in thirds for two bars, then stays on the chord root for one and moves to the dominant 7th", you are going to remember it a lot more easily than by remembering the melody itself.

It would be like comparing how easily you can remember poetry in English vs poetry in, say, Russian, where you only have the "sounds of the words" in your head to remember, but you don't have the syntax or the meanings to help you as well.


For me one of two ways works. Most often I start designing a patch on one of my synths and that ends up becoming a full song. Other times I start by noodling on the piano or organ and ending up with something I like. I suspect the more musically gifted do the latter more often, while the more technical ones like the process of patch creation, etc.


I evolved this way, though I'm far from gifted. Starting out, anything I made was driven by whatever sounds I was noodling with. Now, I almost always start on the piano, compose the outline, and then pick the sounds that I think fit it.

The first approach has a sense of creative wonder to it, where you're being guided by an outsider. As much fun as that is, it is very limiting, and I suspect most people abandon that approach as their skill improves.


Imagine that you're a writer, and you have an idea, so you turn on your computer, wait for it to boot, log in, open up Word, and fiddle around with fonts for a bit... that's what you're doing.

Writers keep pens and notebooks by their bed so if they wake up in the middle of the night they can start writing right now. Or they have tape recorders. Anything works as long as it's immediately available. The iPhone has a "Music Memos" app, I'm sure there's something similar for Android. That's what I use.

Learning music theory and how to write music properly can come later. As long as you can sing, whistle, or hum a tune, you can record it.


I found the same to be true. I've been trying lately to give up approaching music from the "I have this idea I want to get down" perspective. Instead, I set up my studio in such a way that I can easily "play around" and come up with ideas on the fly, and then elaborate on those.

Switching from a DAW to a mostly-hardware setup helped with this, as it's easier to "play" with knobs/sliders/keys/pads than virtual objects accessed via mouse/keyboard. Once you get things wired up, it's pretty straightforward: play around, find something you like, track it in, build more stuff over it.

Ever since making this switch, I found the parts that I used to practice/enjoy (like slicing and manipulating samples, for instance) feel much more tedious.

Another benefit is that it's easier to make mistakes, which often have more interesting results than the thing you originally intended. My guess is because this violates your internal "patterns" and forces you to think outside of your normal "music creation" schema, resulting in a more creative/unique outcome.

I've also tried to switch to "totally live" recording (i.e. minimal sequencing beyond loops and patterns, all automation and non-repeating parts done on the fly), and that's a bit more challenging, because you have to redo everything if you, say, screw up a little solo bit.


>by the time I've got my DAW going and found some synths I like, the tune in my head has evaporated

That's where music theory pays off. Learning to name chords, scales and arpeggios gives your brain a framework to reason about and remember musical ideas. It allows you to break the music into a more concise abstract representation, rather than holding it in your head as sound. If you understand the structure of music, it's far easier to make connections between different pieces of music.
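
As a toy example of what that abstraction buys you (assuming C major and a tune everyone knows), compare a melody stored as raw MIDI note numbers with the same melody stored as scale degrees:

    MAJOR_SCALE = [0, 2, 4, 5, 7, 9, 11]       # semitone offsets within one octave

    def to_degrees(midi_notes, key_root=60):   # 60 = middle C
        # Assumes every note is in the key; a real tool would handle accidentals.
        return [MAJOR_SCALE.index((n - key_root) % 12) + 1 for n in midi_notes]

    melody = [60, 60, 67, 67, 69, 69, 67]      # C C G G A A G ("Twinkle Twinkle")
    print(to_degrees(melody))                  # [1, 1, 5, 5, 6, 6, 5]

The degree version is the kind of compact handle that's easy to hold in your head, and it transposes to any key for free.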


Interesting.

Do you have much formal knowledge of music theory? If not, that might help.

When you "get a tune in your head", if you can describe it to yourself in abstraction, it will probably be easier to remember (or even just write down).

Check out this page on 12-bar blues for some examples of easy music notation. Similar types of notations and/or terms exist for different parts of a song.

https://en.m.wikipedia.org/wiki/Twelve-bar_blues
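
To see how compact that kind of shorthand gets, here's a small Python sketch (an illustration, not something from the Wikipedia page) that expands the common 12-bar blues form into chord names for whatever key you pick; the dominant-7th spelling is an assumption of the example:

    NOTE_NAMES = ["C", "C#", "D", "D#", "E", "F", "F#", "G", "G#", "A", "A#", "B"]
    TWELVE_BAR = ["I", "I", "I", "I", "IV", "IV", "I", "I", "V", "IV", "I", "I"]
    OFFSETS = {"I": 0, "IV": 5, "V": 7}   # semitones above the key's root

    def twelve_bar_blues(key="A"):
        root = NOTE_NAMES.index(key)
        return [NOTE_NAMES[(root + OFFSETS[numeral]) % 12] + "7" for numeral in TWELVE_BAR]

    print(twelve_bar_blues("A"))
    # ['A7', 'A7', 'A7', 'A7', 'D7', 'D7', 'A7', 'A7', 'E7', 'D7', 'A7', 'A7']

Twelve Roman numerals describe thousands of recordings; that's the kind of abstraction that makes a tune easy to carry around in your head.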


I had exactly the same thing happen, losing the core ideas by the time I wired up the synth I wanted. So I dropped the DAW entirely. Now I create most of my music using Loopy[0] to layer the parts that I sing (or occasionally play). It's been fantastic for my creativity.

I'm starting to hit its limits for my workflow though. One of the really nice things about how easy it's getting to write software these days is that I can now fire up, say, a Swift playground, and after getting the fiddly basics of "how to record and loop audio buffers" with AudioKit, there are very few limits on what kind of idiosyncratic workflow tool I can design for myself. The UI looks and acts how I want it to, and since over the years I've trained myself to act like a human synthesizer, I can[1] compose a whole song without even worrying about having an instrument nearby.

[0]Loopy - Multitrack audio looping with very simple and expressive control https://itunes.apple.com/us/app/loopy/id300257824?mt=8

[1]The "can" is theoretical. This is my next big hobby project, and I'm still in the fiddly phase.


Maybe don't use synths for getting your ideas down.

If I have the beginnings of a song in my head, or I have been humming to myself, sometimes I just record the parts I have as vocals - humming or full-on beatboxing the bass/strings/lead/beats separately and as close as I can make them to my head-song (including filters with my mouth)- and then replace as I go, figuring out how to achieve the sounds that were in my head.


My fiancée, who was a professional musician (she had record deals), always keeps a Tascam recorder. When she comes up with a melody or lyrics she puts them on that until we can get in the studio and record.

She hates music theory and trying to use her left brain for art. I'll say "oh, that's in F" and she gets mad, so it's easier to just let her record it than to try to notate it.


I have the same problem too - I solve it by humming/singing the melody and recording it on my phone. Afterwards, I'll find nice synths, put chords to melodies, write horn arrangements, tweak drum fills, etc.


I whistle into my phone. A few of my most complex pieces started that way.


Happens to me too. Ear training might help? There's an exercise where you take a song you know or a familiar recording and try to transcribe it by ear. (I find it's a pretty hard exercise.)


I spent 10 years building a project studio and optimizing the workflow and patch bay so that I can be recording almost any instrument within 30 seconds or so. That was a huge breakthrough and a huge commitment!


Improvising, screwing around, and experimentation are great ways to generate ideas. I think most composers work this way; most generally don't start with some huge structure, they just find some germ-cell ideas by improvising that can then be built upon afterwards with analytic techniques. I think there are two main approaches: the intuitive, feelings- and pure-ear-based approach, which is what most people call the "talent" aspect, and the analytic approach, which is a lot like mathematics, is about studying structure, and is a learned skill. The best composers will use both of these together. You should learn chord structures, scales, and how to read sheet music. This will allow you to conceptualize a musical idea as a concrete mathematical object, and it will help you not lose the idea. The reason you forget the music in your head is that you don't have enough reference points to define it in a memorable way.

You can understand a musical idea as a kind of memory impression, an echo that you can play back in your head, and also as a pattern of pitches and rhythmic structures. Having two reference points, sensory and abstract/mathematical, is very useful.


I think your natural composition skills are unusual, but not unique. I also began to compose music at a very early age. My knack for picking melodies up out of the air and playing them on a piano when I was 7 years old was how my parents knew that I needed lessons. Naturally, I was surprised later in life to learn that other people had to learn basic things like pitch and rhythm; to me it had always been just as natural as speaking.

I believe the same is true with song writing, in a sense. You're still applying some parts of music theory, but most by-ear learners like ourselves simply grasp the concepts and have internalized them naturally, without needing to be taught. Music is little more than patterns at the end of the day, and our brains are very good at recognizing patterns. What you and I know intuitively, others can learn through training and repetition. Both approaches are valid, and yield interesting (and often different) observations.

I went through Music Theory classes during my brief adventure with Liberal Arts majors in college. I felt like I already "knew" the material in a way I couldn't quite put my finger on. It was like I was finally understanding what my brain had been doing all these years. I recommend it if you haven't yet had the experience.


>Do people actually learn how to make up music?

People have studied music and composition since at least ancient Babylon, so, well, yes?

>I always thought it was just some natural ability that people have.

With natural ability you can sing some melodies. To learn to play an instrument or add chords to a melody, you need to study, even if you learn by yourself and by ear (as many folk musicians did). One can have a natural feel for creating a song melody, but nobody just starts writing songs in full form "from natural ability".

>For as long I can remember if somebody told me to write a song I would just spit it out after a while.

What would that mean? You'd write a song on the guitar, for example? If so, then you already know the chords, but not all of the theory, so how complex is your song? Just barebones songwriting (country/folk style)? Can you take it further? Can you write the parts for musicians to play on your song? Can you write different genres on spec?

There are more things in making music/songs than "spitting out" some melody.


When somebody asks me how to solve a particular database problem, or an IT problem, or how to write an algorithm to do something, I will think about it in the back of my mind and "just spit it out after a while," unless it's something difficult enough to warrant a literature review.

That doesn't mean that those subjects aren't covered in detail in textbooks and university courses, or that people cannot learn how to do it.


There are certainly people who have natural ability, and compose melodically, applying varying levels of knowledge in music theory.

There are other people who can't make heads or tails out of a keyboard, compose a tune in their head, or understand chordal progressions, but nevertheless compose music in layers and still do extraordinary work. They find what they like by playing with notes on the screen. Joel Zimmerman, a.k.a. Deadmau5, is an example of this.

I am an example of the former, with natural ability, bolstered by training in music theory. But I still use a layered approach when I am composing, generally starting with a beat or bassline, playing with melodic progressions in snippets, and eventually moving into a traditional composition process when I have something started that I like. Ableton makes this process extremely easy and productive.


Indeed. As a classically-trained musician, watching Joel's class on Masterclass and seeing him compose melodies by dragging notes around in Ableton until they "sound right to him" was eye-opening.


What surprised me was how he makes melody lines: Playing with chords until he likes the progression, and then pulling notes out of the chords to form a melody. And of course it makes sense on one level.

But I think melodically and tend to do a lot of counterpoint. Getting the chords out of my head and onto the screen is often the last thing I do. I don't know how well his approach would work with counterpoint, since counterpoint often creates and resolves dissonance using passing tones in double time.


Very few skills are just "natural ability". Music theory is an interesting and pretty important topic if you want to make music. Since people have been creating music for millennia, they have figured out many things that help composers.


Do you really mean to say you write songs without using any theory or explicitly sought knowledge whatsoever? Let's hear one.


I know a little rudimentary theory. Just enough to get mocked by someone with a real education. "without any theory" is sort of an impossible standard to satisfy, but I will say I never think about theory consciously, and go by how things sound. Anyway, this is what I'm working on:

https://soundcloud.com/cubit/piano-concerto


Are there free instructions like this available somewhere (build-up and peel-back)?


Is Apple's Garageband free? I thought you need to own an OSX device to run it? (my understanding is OSX only runs on Apple hardware and also is not a free OS)


Yes, it's "free" (not open source). It's included with the purchase of a Mac.


The point is, that kind of free is marketing speak. More accurately, you can purchase Garageband as part of a package including Apple hardware. Or you could say Apple hardware is free with the purchase of Garageband.


Not any more.


So no, it's not even gratis; it's $500+, depending on how crappy the hardware you want it tethered to is.


That's a slippery slope to saying that OpenOffice for Windows isn't free software either because you have to buy a Windows box. This is not a useful definition of free you're using. GarageBand is not Libre software.


I dunno if it's a slippery slope starting at Garageband. I say the slippery slope starts at OpenOffice or GoogleDocs or something along those lines, given that OpenOffice could probably be run on a potato if you can find a way to install ubuntu on it and stick some RAM into it.


That's absurd, since OpenOffice runs in Linux, and is free, as in freedom.


when it comes to getting software, we've stopped including the price of the required computer since like… 1995?


Garageband is completely non-free/libre/open, which seems to be what you are saying.


For those wondering, this is made with Elm lang, Web Audio & Tone.js [1]

[1] https://twitter.com/AbletonDev/status/861580662620508160


This is some good coverage of the music theory behind songwriting, which is important in making songs that sound good.

However, there's another part of making music which is not covered at all here: the actual engineering of sounds. Think of a sound in your head and try to recreate it digitally—it'll involve sampling and synthesizing, there are tons of filters and sound manipulations to go through, and they all go by different names and have different purposes—it's a staggering amount of arcane knowledge.

Where is the learning material on how to do this without experimenting endlessly or looking up everything you see? I want a reverse dictionary of sorts, where I hear a transformation of a sound and I learn what processing it took to get there in a DAW. This would be incredibly useful to learn from.


This is something I struggle with as a weekend hobbyist musician: there is some kind of black art involved in making music, in how to get that sound you enjoy in the music you like (which is probably the music that inspires you to make music, at least in my case).

What I found was that as your music-making experience unfolds, you start amassing these little tricks here and there and they're only yours, usually tied to your stack of tools and the way you think. That is extremely hard to replicate and also very personal. IMHO that's why it's so difficult to actually pass that sound-sculpting knowledge to others, and that's why (besides the odd YouTube tutorial on how to make a specific sound -- usually targeted at a specific VST, explaining which knobs to turn) we won't find much general sound-sculpting learning material online. Even though it is available if you gather it from forums and so on, it is still pretty much a personal experience.

Answering your question: As the time passed, the endless experimenting diminished and I got a proper sense of what does what, and after 5 years making music I'm more able to pinpoint what I need to fiddle to transform the sound the way I want/imagine in my head.

I'm still not quite there yet, but if I can offer one piece of advice, it's this: don't shun the "endlessly experimenting to find a sound" thing, because that's the best way to grasp the tools. Over time you'll be able to get there faster, but it's a necessity.

This is how much I evolved, without even noticing, only making tracks after tracks:

Sep 07 / 2012 http://codegrub.org/flipbit/musicmaking/equal02.mp3 cringe

Mar 25 / 2017 http://codegrub.org/flipbit/tracks/flipbit03%20-%20Twothousa...

cya o/


Picking up an analogue synth with all the knobs on the front is a good way to get your head around sound design, and very quickly discovering what does what (sound-wise). An oscilloscope on the output also allows you to see what is physically happening. VSTs tend to 'get in the way' because of the interface, but obviously you could get something like Diva and experiment in the same way. I think reading up on the physics of oscillators, filters, envelopes etc. can be a real help getting that picture in your mind of how to make the sound you want as well.

I've been building up a bit of an epic studio [1] over the past few years after being in-the-box for years. And the hands-on nature of real synths is so much more intuitive than VSTs, IMHO.

[1] https://tinyurl.com/kzl97vl


> a bit of an epic studio

Sir, you have already reached it: it is fucking epic, wow! Congratulations, it must be really fun being in that room, and it must be difficult getting out of it hehehe.

I want to get more into the hardware side of music making but being cost efficient is paramount to getting up and running in the cheapest way possible, especially since (in my case) this is a hobby I consider myself "just starting out" in. If I have some cash to invest in it, I go for what will give me the most return (what will enable me to study the most). In my experience that meant DAW software (Renoise), MIDI keys (Axiom 25), an interface (Yamaha AG06) and a pair of monitors (Yamaha HS8's). Now that I have the basic kit "sorted out" it is time to get some hardware.

What would you suggest? I've been eyeballing a KORG MS-20 mini but I don't know...


> Congratulations, it must be really fun being on that room, and it must be difficult getting out of it hehehe.

Indeed it is!

Monitoring and room acoustics are definitely the very first thing to focus on. It was something I neglected for far too long. If you can't hear what's going on it doesn't matter how much gear you've got.

My favourite hands-on synth is the Roland Juno 106 [1], it's so god damn simple to use, everything is there, and so tweakable. They seem to have gone back up in price, but I picked up a pristine version for £600 off ebay. Obviously you need to be careful with older gear, and definitely try before you buy to make sure the thing isn't falling apart.

For mono synths my favourite is the Moog Sub 37 [2], it's knob central and sounds amazing, as all Moogs do. Although I was considering replacing it with the simpler (but more classic sounding) Model D which has just been re-issued.

The best modern analogue synth I have is the DSI OB-6 [3]. Although we're getting into the expensive end of the market here, I reckon it's a future classic. These things will hold their value very well. It's also got all the knobs and controls you'll need, but with slightly different filters to most other synth manufacturers, which is good for the contrast.

The Korg MS-20 would definitely be a good place to start (I haven't got one myself, but many friends have, and rate them highly), the fact that it has all the knobs on the front for every component of the synth and has the patchbay is perfect for experimentation.

You'll never regret getting an analogue synth, the sound just dwarfs what VSTs do imho. They're _alive_ in a way that you just don't hear from VSTs.

It's also interesting how different analogue compressors and EQs sound compared to VSTs. There's a rawness and sexiness that I have yet to achieve in-the-box (not saying it's impossible, just I'm too lazy to spend ages trying to achieve the sound I can get from hardware by simply switching it on).

> making but being cost efficient is paramount to getting up and running in the cheapest way possible

I have the Chandler Curve Bender EQ [4] which is based on the EMI Abbey Road desk that was used to record Beatles and Pink Floyd albums. It is super expensive (£5000+), but as soon as I heard what it could do I just needed it in my life. I call the on/off switch on the front of it the "it's just better switch" because as soon as I press it the sound in my studio turns 3D and everything is good in the world. I have the plugin version of it (UAD), which is very good, probably the best VST EQ I've heard - but it's not a patch on the gear and doesn't invoke that emotional feeling.

The reason I'm saying this is that yeah this stuff is expensive, some of it super expensive, but if you pick up one piece of gear a year and learn it inside out you'll be in a great place - creating awesome sounds quicker than you ever could before in-the-box. Most people I know with killer studios took a decade to get there.

[1] http://www.vintagesynth.com/roland/juno106.php

[2] https://www.moogmusic.com/products/phattys/sub-37

[3] http://www.soundonsound.com/reviews/dave-smith-instruments-o...

[4] https://www.youtube.com/watch?v=aUv9GtMlUwA


Thank you for your tips, my friend! You are totally right: go slow, pick your gear one at a time and after some time I will have a great little home studio to play with :-)


Are more of your tracks posted online?


Yes, I post all my stuff to Soundcloud and Youtube, this way I can get constructive feedback and learn even more.

Here are the channels where you can listen to more of my stuff; by all means, please help me get better by commenting and giving feedback if you can. If you make music as well I will gladly return your energy and time by commenting and giving feedback. :)

Also, I usually participate on the listen thread/feedback rounds on reddit's /r/edmproduction, you'll find me there as well commenting on everyone's tracks ;)

https://soundcloud.com/flipbit03

https://www.youtube.com/user/cadumimi

cya o/


You're right that it's a staggering amount of arcane knowledge. But starting out I always recommend experimentation over getting too deep into the theory. It does help to have some baseline understanding of:

1. Frequency
2. Harmonics
3. Oscillators/Waveforms
4. Envelopes
5. Filters

The only problem with the last part of your request is that even if you watch people design sounds for a couple of hours, you might find that when you try to replicate it somewhere else it doesn't sound right. This is partially because every synth/softsynth is different and will produce different sounds and have different parameters. It can be infuriating to follow a tutorial on how to produce that perfect "Bladerunner Blues" synth and come out with something that sounds totally flat and bad.

To make matters worse, there are apparently 0 good tutorials on the subject - I just googled for 15 minutes to no avail. The two below cover some of it but I personally can't bear listening to the people who make these videos.

https://www.youtube.com/watch?v=TvQVQuV-Kys https://www.youtube.com/watch?v=lJVlWdzoZ0w
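
If it helps to see items 3 and 4 from that list without any particular synth in the way, here's a bare-numbers Python sketch of a sine oscillator shaped by an ADSR envelope. The sample rate and envelope times are arbitrary assumptions, and there's no filter or audio output, just the math:

    import math

    SR = 8000  # samples per second (kept low so the example stays tiny)

    def sine_osc(freq, seconds):
        """Item 3, the oscillator: a plain sine wave."""
        return [math.sin(2 * math.pi * freq * i / SR) for i in range(int(SR * seconds))]

    def adsr(n, attack=0.05, decay=0.1, sustain=0.6, release=0.2):
        """Item 4, the envelope: per-sample gain that ramps up, falls to the sustain level, holds, then fades out."""
        a, d, r = int(attack * SR), int(decay * SR), int(release * SR)
        s = max(n - a - d - r, 0)
        env = ([i / a for i in range(a)] +
               [1 - (1 - sustain) * i / d for i in range(d)] +
               [sustain] * s +
               [sustain * (1 - i / r) for i in range(r)])
        return env[:n]

    env = adsr(SR)                         # one second's worth of gain values
    print([round(env[i], 2) for i in (0, 200, 400, 1200, 4000, 7500)])
    # [0.0, 0.5, 1.0, 0.6, 0.6, 0.19]  <- the attack/decay/sustain/release shape

    note = [x * g for x, g in zip(sine_osc(440.0, 1.0), env)]   # an enveloped A440

A filter (item 5) would then carve harmonics out of that signal, but a sine only has one, which is why real synth patches start from richer waveforms.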


I would even narrow that list down to one: Harmonics. Once it clicked in my mind that every sound is just a combination of sine waves, and that it's the intervals, amplitudes, and dynamics of those sines that make up everything we hear, it made sound design a lot clearer for me.

Of course, finding the right waveforms, filters, and envelopes required to get to a particular pattern of sines is still the challenge, but having that understanding of the medium underlying it all makes experimentation that much more productive (and fun).
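
A minimal sketch of that "sum of sines" point, with an arbitrary choice of fundamental and harmonic counts: summing harmonics at 1/n amplitude approaches a sawtooth wave, and the more harmonics you add, the sharper the edge gets.

    import math

    def additive_saw(freq, t, harmonics):
        """Value at time t of a wave built from the first N harmonics at 1/n amplitude."""
        return sum(math.sin(2 * math.pi * freq * n * t) / n
                   for n in range(1, harmonics + 1))

    # Same fundamental (110 Hz) and the same instant, with more and more harmonics:
    for n in (1, 3, 10, 50):
        print(n, round(additive_saw(110.0, 0.001, n), 4))

In that view, a low-pass filter is just turning the higher of those sine components down after the fact.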


Also people who have really hot Sound Tips generally don't want to give them away. If you can make a unique sound with some special trick you will have an advantage over your enemies (other musicians).


One problem is that every machine tends to be designed just a little bit differently. Therefore, tips on exactly recreating the sound might not necessarily translate well from one machine to another.

For instance, the "Blade Runner Blues" patch as I understand it is actually one of the brass presets on the Yamaha CS-80. (Bad recording but here: https://www.firsthomebank.com/personal-banking/deposit-produ...) The CS-80 has a pretty unique architecture for a polyphonic analog. (http://www.cs80.com/tour.html) To get a patch exactly right would require replicating layout, filter architecture and structure, etc.

Knowing basics synthesis, however, can get you pretty close. I have a patch on my Alesis Andromeda (which has some CS-80 type elements such as a ribbon controller, dual resonant filters, and an unfiltered sine that goes to the post-filter mix) that someone did in a user community -- it came out decently good. I was able to Google a book page that gives a good overview of recreating it on other synths. (https://books.google.com/books?id=Jz1JMnZNO88C&pg=PA74&lpg=P...)

Now, to really get the Vangelis Blade Runner type effect, you have to be able to play a synthesizer expressively. This is unfortunately tougher on most synths compared to the CS-80, due to the CS-80's polyphonic aftertouch, which most synths lack. That being said, there are other techniques people could use. I understand that Vangelis used pedals to manipulate filter and volume, and that is something that can be done on many synths that I don't see a lot of people taking advantage of. Don't discount playing technique when it comes to the art of sound design, in other words.


That's an understatement. Vangelis improvises whole soundtracks live, playing fairly simple melodic lines and counterpoints with his hands, but manipulating LOTS of pedals to arrange on the fly. By that, I mean more than ten pedals, arrayed in an enormous bank at his feet. It's staggering, and I can't think of a single other electronic musician with nearly that proficiency at foot-pedals.


I really like that you just went off on a huge tangent about this, no sarcasm. I really agree with your last line too; Kevin Shields is another example of this. By perfecting a unique playing style (holding the pitch bar while strumming), he was able to come up with a sound so unique that it spawned a subgenre.


Only if your claim to fame is primary sound transduction and not, say, being a guru of giving other people tools and help with their ideas. My own career over the last ten years or so has been based on the latter.

I will say that I think the 'power-law' nature of that is not dissimilar to being a primary sound transduction artist. You don't get a large number of people being celebrities at tutorials, or of disseminating free plugins.

And yeah, I do mean to expand upon this: got a likely domain for it just yesterday. The trick there is that you need to be inter-disciplinary enough that you can produce a really wide range of content, which by definition a newbie couldn't possibly process. I can go from "slew rates in op-amps in boutique guitar stompboxes" to "exploiting unusual interpretations of the Circle of Fifths" (did you know the Four Chord Song can be read as an atomically contained minimum-area space in an extended diagram of the circle of fifths?) but a newbie wouldn't cover that range.

There are no secret weapons, just secret masteries: by that, I mean 'stuff that's sensible and obvious, but to the contextless outsider seems like black magic coming out of nowhere'. Any sufficiently deep context seems like magic to someone who has no idea of the scope of that context.


At least when it comes to synths, check out Syntorial. It's very similar to what you want, though it only covers a certain kind of synthesis.

http://www.syntorial.com


Syntorial is pretty good. If you're looking for something free (and much more basic), I have a quick video series on YouTube: https://www.youtube.com/watch?v=VSwjp7Zt1GY&list=PLKzX4WhkkV... I built it as part of my course I'm working on for Sagefy, so that would include some multiple choice practice questions too: https://sagefy.org/subjects/CgDRJPfzJuTR916HdmosA3A8/landing But there's nothing quite like just using the tools and getting experience directly, for sure.


Seconded. Syntorial is an awesome way to learn to program synthesizer sounds. It plays a sound, then you replicate it with the synth controls. It starts easy and gradually ramps up the difficulty, adding more knobs to twiddle, explaining the concepts as it goes along. It's a million times better than staring at a full synth control board, moving knobs and hoping that you figure it out eventually.


There's no easy way. As you mention, this is arcane knowledge--people really do study it for years or decades to train their ears to the appropriate levels. The two big components are standard music production techniques (reverb, compression, EQ, appropriate mic usage, etc.) and then domain-specific knowledge for the instrument (characteristics of different guitar amps, say, or synth filters).


As other replies allude, it's also highly genre/goal dependent.

If you're trying to make your rock band sound more like Led Zeppelin, there is a fairly fixed set of tools and instructions (albeit futile, ultimately).

If you are imagining a pure sound in your head that is not straightforwardly produced by an instrument, then it gets a lot more complicated, and there are countless routes to the same goal. The experimenting is the fun part though!


Post a YouTube link of the sound to a genre-appropriate producers' forum or Facebook group.

For the longer route this is a classic http://www.soundonsound.com/techniques/synth-secrets


Oh man that is a rabbit hole you don't want to go down. Modern software synths are extremely complex, but also extremely powerful.

I mean, look at the interface for Serum, probably the best synth on the market right now:

https://d84g6yundlaof.cloudfront.net/assets/serum/serumall-b...

It looks like an airplane's cockpit.

Sound design is a whole other part of music. Most amateur musicians don't even bother with it because it is way too technical to master. They just use presets.

I personally hate it, but if you have a technical bent, you might enjoy it


Serum is the easiest synth I have ever used. There are a lot of controls, but they all make sense. And if one is just starting out, they can ignore most of the controls and focus on just the basics: waveform, envelope, filter. Then move to modulation, starting with LFOs. (Modulation is where Serum really shines. It is literally drag-and-drop. Compared to the modulation matrices that many synths use, it's a cakewalk.)

If you think Serum looks complex, take a look at Zebra 2.


The people at Ableton are well aware of this (the term "the studio as an instrument" comes up quite often at their event "Loop") but I'm not sure there's a better way to learn besides experimenting. I've heard quite a lot of stories of (electronic) music creators talking about their beginnings: how they would ask more experienced friends for help, how the friends would reply "no, just try things", and then how they felt in the end it was a good thing once it "clicked."


THIS, this right here. That idea has come to my mind many times, having something like a library of "recipes" for sounds. The hardest part for me whenever I try to do something in FL Studio is getting the source samples and making them sound the way I want individually. It's a shame to have an idea for a song in my head and not be able to materialize it just because I don't know what kind of plugin or instrument I should use.


It's one reason I like monosynths (one or two oscillators). They really help you understand basic subtractive synthesis. Then you can layer on. A very simple modular system can help too, because you have to understand the fundamentals of sound and modulation.


Aside from training the ear, you need a pretty sophisticated understanding of how soundwaves translate into sound.


I always wondered why musicians keep up with the conventional musical notation system, and haven't come up with something better (maybe a job for a HNer?).

I mean, the conventional music notation represents tones on five lines, each capable of holding a "note" (is that the right word?) on a line, as well as in between lines, possibly pitched down and up, respectively, by flats and sharps (depending on the key etc.).

Since Western music has 12 half-tone steps per octave (an octave being an interval wherein the frequency is doubled, i.e. a logarithmic scale, so compromises have to be made when tuning individual notes across octaves), this gives a basic mismatch between the notation and e.g. the conventional use of chords. A consequence is that, for example, with the treble clef, you find C' in the second space from the top, and thus at a very different place visually than the C one octave below, which sits on, rather than between, an additional ledger line below the bottom-most regular line.

I for one know that my dyslexia when it comes to musical notation (e.g. not recognizing notes fast enough to play from the sheet) has kept me from becoming proficient on the piano (well, that, and my laziness).
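
To make the logarithmic point concrete: in 12-tone equal temperament each half-tone step multiplies frequency by 2^(1/12), so 12 steps exactly double it. A tiny Python sketch (taking the usual A440 reference, which is an assumption here, not part of the argument):

    A4 = 440.0

    def freq(semitones_from_a4):
        """Equal-temperament frequency n half-tone steps above (or below) A4."""
        return A4 * 2 ** (semitones_from_a4 / 12)

    for name, n in [("A4", 0), ("C5", 3), ("A5", 12), ("middle C", -9)]:
        print(name, round(freq(n), 2))
    # A4 440.0, C5 523.25, A5 880.0, middle C 261.63

The staff, by contrast, spaces notes by scale step rather than by semitone, which is exactly the mismatch being complained about.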


> I always wondered why musicians keep up with the conventional musical notation system, and haven't come up with something better (maybe a job for a HNer?).

You're not alone, this is a common reaction to music notation by engineers; a lot of people have wondered the same thing, even here on HN. For example https://news.ycombinator.com/item?id=12528144 https://news.ycombinator.com/item?id=12085844

I see some great responses, but I wanted to add that you have to keep in mind that tons of people have actually tried to make a better system, and nobody has succeeded. That should give you enough pause to ask why and consider the possibility that the system we have is really good in a way that you haven't recognized yet.

I think the problem is that difficult to learn and bad are easily confused. It is difficult to learn.

Also keep in mind that music notation has undergone many iterations; it represents developments over hundreds and hundreds of years and covers every instrument under the sun - the breadth of what it has done throughout history and what it can do might be hard to see.


>I see some great responses, but I wanted to add that you have to keep in mind that tons of people have actually tried to make a better system, and nobody has succeeded. That should give you enough pause to ask why and consider the possibility that the system we have is really good in a way that you haven't recognized yet.

I think that this is the incorrect way of looking at it. I suspect it is less that the traditional notation system is highly evolved and effective, and more that getting a critical mass of musicians to transition/relearn/teach/translate into a newer system is incredibly difficult.

For instance, while Imperial units aren't without some advantage, they are pretty generally inferior to the Metric system. But the US hasn't really switched because it requires a significant level of coordination and control that simply isn't easy to access. And getting musicians to learn and teach a brand new, objectively better system would be much much harder.


The current system is 800 years old, and over that time it has won out over hundreds of different systems. A new system is proposed every now and then, and even though it might be better in a specific problem domain (say, microtonal music), they always fall apart.

I have thought a lot about the problem (worked as a professional bassoon player for a very long time), and I can't say I have had many good ideas. There are some ideas for simplified music notation (with different shapes for flats and sharps) which work _very_ well for making sight reading easier. Until it doesn't: It can't express enharmonics (different ways of writing the same note), which makes tonality analysis harder, and can actually hamper readability since most people that are fluent in reading music usually "stay in key" when reading music.

A quick Google gave me this: http://musicnotation.org/ and I can't say I am very impressed by anything I see there. But as you notice, most systems are oriented around lines. I don't think that is because people lack imagination, but because it is a pretty good way to write music.


What do you think about parallel visualization? Right now, musical notation strives for a single notation that tries to encompass the entire work—and to also serve as a canonical, lossless transcription of the work, from which it can be recovered.

If you drop that requirement (and then assume digital storage) you could have 1. an underlying canonical format that has "all the information" but which is never presented to the performer, nor to the composer; and 2. a number of views that expose various dimensions of the composition. Like orthographic projections of a model in CAD software.

Presuming an interactive display (touchscreen, etc.) you could switch between these views at will; but even for printed sheet music, you could just isolate one measure at a time and then display several "stacked" views of that measure per page.

(Basically, picture widely-spaced, annotated sheet music, but where the annotations are themselves in the form of more musical notation, rather than words, appearing in additional sub-staffs attached to the measure.)


"Right now, musical notation strives for a single notation that tries to encompass the entire work—and to also serve as a canonical, lossless transcription of the work, from which it can be recovered."

I don't believe this to be true. (Modern) guitar music is most often written in tab, often without accompanying staff notation. Also, staff notation is not lossless; musicians will interpret the music differently. For example, with violin, whilst some instruction is given on bowing, it is almost never complete, and musicians will find different ways to fit the bowing to the rhythm. This can make a huge difference to overall tone as (most simply) the up bow sounds distinctly different to the down bow.


I do think this is the direction it is heading. There are new "smart" music stands coming to the market now with similar features.

Conductors can write notes about certain parts that can be accessed by musicians. Opera musicians (where different people play the same music every night) can have their own personal notes.

Most exciting is of course that everyone has instant score access. That removes a shit-tonne of time wasted during rehearsals.

Those are just traditional use cases. I'm excited to see what will come. I don't know if music as it is practiced today can be "expanded" in any meaningful way, but only time will tell.


I think this is a great idea as part of a learning tool, being able to simultaneously visualise a musical idea on a score, in guitar TAB, woodwind fingering, piano roll etc.

I've got a plan on the backburner to do something like this using Ohm https://ohmlang.github.io/


You have a great point, that getting the world to switch would be very very hard. But it's not black and white, you can't compare that to anything.

If there's a viable alternative to music notation that you know of and is superior to what we know as standard western notation, feel free to share.

Your choice of example is interesting, considering metric has won, and the US is switching slowly.

But there is no incorrect way of looking at it, music is an art. Standard notation is highly evolved and effective, it has been iterated on for millennia. Getting a critical mass of musicians to learn a newer system would be incredibly difficult. Both are true, and you can't compare them and say that one is "more", that's flatly not true in any meaningful sense.


I hope people don't think I'm being brusque here, but these comments are a classic case of an outsider looking at the system, admitting to be lazy and wondering why the rest of the world differs from their expectations vs. asking musicians what they think.

At its core, musical notation is succinct: a mixture of logic and unique symbols. Note markers are isomorphic to pitch. Rhythms subdivide with vertical lines. Special symbols and brief phrases denote beginnings, ends and loop points. (They're not usually in English) Geometric figures indicate volume and speed changes.

A competing system in my purview is "tracker" notation. It's vertical and generally only used on machines, but hand writable: It looks like: C-3 Eb3 G-3 Bb3


I have the same feeling. Music notation might be hard to interpret sometimes, but none of the alternatives actually solve anything. They do however introduce a whole lot of questions.

I think a valid comparison is the regular alphabet. It is, after all, a coding system for language in the same way that notation is a coding system for music. Most of the problems of that coding system (my pet peeve is English spelling) generally stem from conventions rather than problems with the alphabet (Italian and German are much easier to spell correctly).

There might be some interesting alternatives (hangul!), but those systems come with their own share of problems and generally have no big benefits. I actually believe that musical notation is a better fit for its task than our current coding system for language.


> the US is switching slowly.

As a US citizen who is a metric fan and loves using it, in what way? The government did switch - in the 70s - according to its own statutes, it had to.

It has crept up in various places (and I find it hilarious) innocuously, like in 2 liters of soda, or in how computer processors are talked about in mm² die areas.

But the average American still uses imperial units religiously, anywhere they approach a problem involving any unit of measurement they always default to imperial, and having a 14 year old brother I see no change in his education or habits to indicate a slow transition of mindshare. The government moved decades ago, but the people aren't moving at all.

I get the impression it is much like high school language classes - you learn it once early on, never practice it, and by the time you are a full adult you have completely forgotten it. I'm not sure how to improve the situation to actually get the people to start using international standards, because if you were to start trying to force it on the supply side people would just not buy metric tools and information because they forgot it back in primary school.


I think the way to switch would be done the same way other countries have done so:

1. Make sure everyone is educated in metric

2. Change the easy things: the paper size the government uses, the units on food labels, the measures legal to use for sales of loose food or other goods, the units the government uses for all types of reporting. (Therefore if businesses want government contracts, they'll need to use metric.)

3. Change other standards, like residential construction, preferred fasteners, wire sizes. Where old measures are required for compatibility, write "24.5mm" in the standard. If the dimension could be changed to 25mm without any side effect, use that.

4. Change other things people see daily: I don't know if doctors use metric in the US, but I assume they communicate to patients in old units. Change the default, but accommodate older people. Change the road signs. Is anything left?

The UK is part way through 4, but has been stuck there for decades.


Musicians frequently get taught music in large batches at schools, though, which means you don't have to worry about network effects—there are choke-points in the network.

There's no reason a given school couldn't teach a "colloquial notation" first, with the "Lingua Franca" musical notation taught later on, for everyone in that given school. Then everyone who comes from that school would know that colloquial notation.

Consider: the "Chicago school" of Economics; "Rugby School" football; etc. These things start as colloquialisms, then spread to global awareness.


England had a colloquial notation, taught in schools, for several decades: tonic sol-fa. But it could only talk about melody and rhythm, not harmony. It fell out of mainstream use in the late 1960s, perhaps as the music publishing industry consolidated and globalized, making it easier to have a single international edition of each song instead of separate editions by country.


Music notation is to music as qwerty is to keyboards?


No, qwerty for keyboards is more like the layout of the 12 notes on an instrument, and there are many instruments.

Music notation is more like a programming language. The score is like a program that you can read/interpret and play.


For instance, while Imperial units aren't without some advantage, they are pretty generally inferior to the Metric system.

You say this pretty matter-of-factly, but I actually vehemently disagree. Many imperial measurements are better than their metric counterparts for day-to-day lay usage.

- Fahrenheit is a better scale than Celsius.

- Inches, feet, and miles are very practical units; centimeters and meters much less so.

- Pounds are smaller and offer better delineation than kilograms.

- Liters are pretty similar to quarts, though I admit the various imperial sub-units are annoying.

Sure, it's easier to convert between metric scales, but the number of times I actually do that?: approximately zero.


“In metric, one milliliter of water occupies one cubic centimeter, weighs one gram, and requires one calorie of energy to heat up by one degree centigrade—which is 1 percent of the difference between its freezing point and its boiling point. An amount of hydrogen weighing the same amount has exactly one mole of atoms in it. Whereas in the American system, the answer to ‘How much energy does it take to boil a room-temperature gallon of water?’ is ‘Go fuck yourself,’ because you can’t directly relate any of those quantities.” Wild Thing by Josh Bazell.


That's great.

When is that EVER useful to the layperson?


Cooking


You don't really give any reason why Fahrenheit is a better scale, or why inches, feet, and miles are particularly practical units. It seems to me that people say this simply because they are used to them. You don't convert to metric units, and it feels awkward because you don't use metric units.

There is an issue with "kilometer" being a complex word for everyday use (as compared to a mile) in the English language. That's more a linguistic issue than a problem with the unit itself. Other languages solve that with a shorter colloquial name for the unit.

Of course the imperial units give a good opportunity for being funny, in ways like specifying speeds in furlongs per fortnight. But you can do the same in SI-derived units, like parsecs per picosecond.


> There is an issue with "kilometer" being a complex word for everyday use (as compared to a mile) in the English language. That's more a linguistic issue than a problem with the unit itself. Other languages solve that with a shorter colloquial name for the unit.

Even in English people of a certain age can say "klicks" and be understood.


Exactly, it's something that mass usage will solve, even if the folk song will not sound just the same with "a hundred klicks, a hundred klicks, I am five hundred klicks away from home".

Other languages often just say the letters "k" or "km".


Inches, Feet, & Miles are very practical units. Centimeters, and Meters much less so.

Really? Do you know how much easier it is to compute areas and volumes in the metric system compared to imperial? Concrete example: figure out how much soil you need to buy to fill a box, knowing L, W, and H. In metric it is a 10-second process. In imperial I do not even know how you are supposed to do it. Does anybody even know how many quarts are in a cubic foot?
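
To make the arithmetic concrete, here's a rough Python sketch (the box dimensions are invented for illustration; the only constant you'd have to look up on the imperial side is that a cubic foot holds roughly 29.92 US liquid quarts):

    # Metric: measure in meters, multiply, and liters fall out directly.
    length_m, width_m, height_m = 1.2, 0.5, 0.3
    volume_liters = length_m * width_m * height_m * 1000    # 1 m^3 = 1000 L
    print(f"{volume_liters:.0f} L of soil")                  # 180 L

    # Imperial: measure in feet, then go find a conversion factor.
    length_ft, width_ft, height_ft = 4.0, 1.5, 1.0
    QUARTS_PER_CUBIC_FOOT = 29.92                            # looked up
    volume_quarts = length_ft * width_ft * height_ft * QUARTS_PER_CUBIC_FOOT
    print(f"{volume_quarts:.0f} quarts of soil")             # ~180 quarts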


> Does anybody even know how many quart are in a cubic foot?

They know it, after they do the conversion via the metric system.

(OK, nowadays you can just enter "1 quart to cubic feet" in Google. And you can find the funnier ones at https://en.wikipedia.org/wiki/List_of_humorous_units_of_meas... )


No, I don't know the number of quarts in a cubic foot, but no one does because they're two different measurements for two completely different uses.


No wonder you miss the point of metric units if you don't see why such conversions are useful.

In the metric system, converting between length, volume and weight is trivial and straightforward. This comes in handy whenever you need to measure out a precise amount of batter or liquid using containers marked in a different unit.


Another way to look at it is that the current notation system isn't the best overall, it's just the most tolerable trade-off between a bunch of mutually-incompatible requirements.

Replacing standard notation for all uses may be doomed to failure, but replacing standard notation for some particular use case (especially new use cases that weren't anticipated when standard notation settled into its current form) may be a very useful thing to do.

Computers also give us a few new options, such as displaying notation in a time-varying form, or using three dimensions, or notating the music in some universal language that isn't necessarily easy to read but that can be easily rendered in any desired notation.

Lattice notation for instance is something I really like, but I don't know how to represent it without some kind of animation.

Here's an example I stumbled across on Youtube a while back of the kind of thing I mean: https://www.youtube.com/watch?v=jA1C9VFqJKo

Lattices generalize to higher dimensions, which means they might be amenable to virtual reality or even some sort of human-brain interface that allows you to experience 4 or 5 spatial dimensions at the same time.


> Another way to look at it is that the current notation system isn't the best overall, it's just the most tolerable trade-off between a bunch of mutually-incompatible requirements.

Isn't the most tolerable trade-off between mutually-incompatible requirements another way of saying "best overall"?

Totally agreed there are useful local overrides of standard notation. Tablature is one example, and there are others. I wouldn't call those replacements for standard notation though. Both notations exist, both serve different purposes, neither is going away, there's no either-or question to be resolved.

The lattice videos are super interesting! Thanks for sharing that. I want to watch a few more and understand his layout choices -- I think I kinda get it, triads form triangles. These don't encode anything temporal though, so this is a visualization that helps understand harmony spatially, but is not a musical notation and can't encode a song, right?


> Isn't the most tolerable trade-off between mutually-incompatible requirements another way of saying "best overall"?

I could have said that better. What I meant was that standard notation isn't better than every other system according to every metric we could use to compare such things.

Gary Garrett has more lattice demos on Youtube. Here's one that's an animation of an example in Harmonic Experience by W. A. Mathieu (which uses lattices extensively to explain harmony and is the best reference I know of for explaining how to understand them): https://www.youtube.com/watch?v=I49bj-X7fH0

A 3-5 lattice is a grid where one axis is fifths (powers of 3 in just intonation) and another axis is major thirds (powers of 5). Garrett implies a third axis for septimal flatted seventh (i.e. barbershop 7th) intervals. Since the grid is leaning to the right, the diagonals that lean to the left are minor thirds. Powers of 2 (octaves) are usually ignored. Triangles that are flat on the bottom are major triads. Triangles that are flat on top are minor triads.
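
A rough sketch of the coordinate idea (my own toy code, not anything from Garrett's videos): each lattice point is a count of fifths and thirds, and its pitch is the corresponding just-intonation ratio folded back into one octave.

    from fractions import Fraction

    def lattice_ratio(fifths: int, thirds: int) -> Fraction:
        """Pitch ratio for a 3-5 lattice point: `fifths` steps of 3 and
        `thirds` steps of 5, octave-reduced so that 1 <= ratio < 2."""
        ratio = Fraction(3) ** fifths * Fraction(5) ** thirds
        while ratio >= 2:
            ratio /= 2
        while ratio < 1:
            ratio *= 2
        return ratio

    # A major triad around the origin: tonic, one fifth up, one third up.
    print(lattice_ratio(0, 0), lattice_ratio(1, 0), lattice_ratio(0, 1))
    # 1 3/2 5/4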

There isn't an obvious way to encode a whole song onto a single lattice diagram in a way that could be printed on a page and still be readable. They seem to work pretty well as animations or as static illustrations to explain chord transitions, though.


> What I meant was that standard notation isn't better than every other system according to every metric we could use to compare such things.

This is totally true; tablature is better for beginning guitar players to learn to play specific songs on the guitar.

The only reason tablature doesn't supplant standard notation is that the metric under which it's superior is much narrower -- it's only for guitars, and only better than standard notation for beginners.

I don't think standard notation is necessarily the best possible system, but I do think it happens to be the best overall, the best we've got today. And I'm not convinced it will ever become a choice, as opposed to standard notation evolving like it has in the past to incorporate new ideas.

Thanks for the explanation of the lattice layouts; I hadn't noticed the triangle orientation part, I only got as far as seeing that horizontal lines formed the circle of fifths. I can't tell what the plus and minus symbols mean, do you know? Usually those are used for diminished and augmented chords and not single notes, so is Bb- another name for A that is useful under the lattice system?


It's a way to identify distinct pitches that are usually treated as the same in equal temperament.

For instance, in just intonation 2 (the major second of the scale) has a frequency that makes a ratio of 9/8 relative to the tonic, but sometimes you might want a slightly flatter major second with a ratio of 10/9. So, that note is labeled 2- to distinguish it from the regular major second.
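
For a feel of how far apart those two seconds are, here's a quick back-of-the-envelope calculation (my own illustration, not part of the lattice convention): the gap is the syntonic comma, about 22 cents.

    from fractions import Fraction
    from math import log2

    def cents(ratio):
        """Interval size in cents (1200 cents per octave)."""
        return 1200 * log2(ratio)

    print(round(cents(Fraction(9, 8)), 1))    # 203.9 -- the "regular" major second
    print(round(cents(Fraction(10, 9)), 1))   # 182.4 -- the flatter 2-
    print(round(cents(Fraction(81, 80)), 1))  # 21.5  -- the gap (syntonic comma)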


Maybe no one has succeeded with a general replacement, but there are different notations for guitar. I assume some other instruments have their own notation too. When electronic music kicked off, to reproduce a sound you had to trade setups / circuit diagrams; old music notation can't encode that! I kind of think of it like x86 assembly. It's here to stay, for better or worse, but that doesn't mean you can't have nicer things on top, and there are still things that don't make any sense at all in the x86 world (like FPGAs for one).


Tablature has a long history as well, it didn't start with guitar. Before guitar there was lute and cittern tablature -- which typically use letters and not numbers. I play both guitar and lute and I actually wish the letters convention had stuck, it's more fun. Wikipedia says that the first known tablature was for an organ. https://en.wikipedia.org/wiki/Tablature#Origin

Yes, some other instruments have their own specific notations & tablatures as well. These aren't replacements for standard notation though, and never will be. They have a place, and they are useful, but they aren't in competition with standard notation. Tablature has its disadvantages (https://en.wikipedia.org/wiki/Tablature#Disadvantages), but the single biggest reason for standard notation -- group, band, ensemble & orchestral playing -- is something tablature can't help with at all.

Totally agreed that standard notation doesn't help with electronic sound reproduction, but I'd suggest that standard notation isn't for sound reproduction in the analog world either; that's not its purpose. Standard notation is the sequencer, not the synthesizer. You can use standard notation to encode songs in the electronic music world, but it's definitely not super convenient, and hardly anyone does that. The analog version of trading setups and circuit diagrams is carving your violin using plans and specifications of a Stradivarius violin.


I have a theory for this. Please do not downvote me; I am here with limited English but really good intentions.

The QWERTY keyboard is something humanity found a better solution for: people have developed better layouts, like Dvorak for example, and the world keeps using QWERTY (not in my case).

I learned a long time ago that TCP is also not the best protocol; there are much better and faster ones, but people keep using the old TCP for the Internet...

I believe that when something is already consolidated, it's expensive to change; sometimes it's not worthwhile to update all the consolidated knowledge/investment, even when better solutions exist.

The world updates consolidated solutions only when the gain is really worth it, and that's not the case for music notation.

I also agree with you that music notation could be easier, but I believe it doesn't get upgraded because the master musicians have mastered it, so they like the current notation, and they are the ones with enough knowledge to create a better version. I believe there are other types of notation, but they would need to be used by master musicians, music schools, and universities to start a wave that could replace the current notation (which already works pretty well).


The argument that Dvorak is superior and that inertia is keeping people from converting has been studied, and while I think there's some element of truth, it doesn't seem particularly compelling, since big disruptive changes occur all the time.

"the best-documented experiments, as well as recent ergonomic studies, suggest little or no advantage for the Dvorak keyboard."

https://en.wikipedia.org/wiki/Dvorak_Simplified_Keyboard#Con...

"The trap constituted by an obsolete standard may be quite fragile. Because real-world situations present opportunities for agents to profit from changing to a superior standard, we cannot simply rely on an abstract model to conclude that an inferior standard has persisted. Such a claim demands empirical examination."

http://www.utdallas.edu/~liebowit/keys1.html

Musical notation is a vastly more complex system than keyboard layout, and I don't believe we have a Dvorak of music notation to even compare with. There are no contenders for musical notation that a large group of people believe are superior. So there's no reason to believe that inertia is keeping people from using another notation.

To go one step further, music notation is constantly changing, it has been evolving, adopting and incorporating the best ideas for thousands of years. What reason is there to not start with the assumption that it already took the best changes so far? I have no doubt that if superior ideas for notation develop in the next hundred years, that at the end of it, we'll still call the result 'standard music notation'.


My intention was not to compare music notation and Dvorak, but to write about human behavior in similar situations regarding the "inertia" you cited.


Totally, I understand. And mine wasn't to counter Dvorak specifically, but mention that the inertia theory has been questioned, and also mention that sometimes things are believed to be better by some people but in reality aren't much better if at all for most people. Sometimes inertia is posed as a reason for not changing when in fact the reason is the accepted system is the superior system for the largest number of people.

The latter is my theory about music notation; that inertia is not even at issue yet because there are no serious alternatives.

And inertia might never be an issue, because music notation is a fluidly changing system. TCP and qwerty/Dvorak are static systems that don't ever change, so you can argue about which one's better. Music notation is changing and improving, so it's hard to suggest that people are resisting change, and hard to suggest that something better will supplant it, right?

I agree with your theory in general though, outside of the issue of music notation, and I think a lot of people do. It's just a matter of finding the right examples that clearly demonstrate it. And it would be really interesting to somehow quantify the amount that something needs to be better before people will adopt it. It's like static friction in physics -- it takes more force to get something started moving than it does to keep it moving.


Now I got your point! Thank you.


That's something I've done time and time again, and seen others do too. It's easy to look at something and think you understand it well enough to know how it can be improved. But when you find out the rationale and reasons it is the way it is, it's kind of humbling. Like how it surprised me to learn that there's a lot of valid, practical reasons to use the Imperial measurement system over Metric.


> Like how it surprised me to learn that there's a lot of valid, practical reasons to use the Imperial measurement system over Metric.

I've always wondered about that. Why?


I don't remember where I read it, but one of the big reasons was that Imperial units are much easier to divide in ways that make a lot of sense in practical usage, whereas Metric is designed to make conversions easier for doing science, which puts practical usage on a lower priority. But take this with a grain of salt.


* 64.7989 mg of salt


Machinists and engineers often prefer Imperial "mils" (thousandths of an inch), for instance. It's easy to convert from kilograms to pounds or kilometers to miles, but there's no convenient metric unit for expressing typical distances and tolerances used in mainstream machining. A millimeter is way too coarse, a micrometer is way too fine.

As a specific case, in electrical work, it's easy for me to specify "6 mil trace/space" attributes for a PC board design. Not so easy to say "0.1524 mm" or "152.4 microns." If I round my specification down to 0.1 mm, the resulting copper features will carry less current and cost more money. If I round it up to 0.2 mm, other physical and/or electrical requirements won't be met. So now I have to add at least one more sig fig, which is a pain in the neck for no obvious benefit.
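
For reference, the conversion being complained about is exact (1 inch = 25.4 mm, so 1 mil = 0.0254 mm); a trivial sketch, using the trace widths above as examples:

    MM_PER_MIL = 0.0254   # 1 mil = 1/1000 inch; 1 inch = 25.4 mm exactly

    def mils_to_mm(mils):
        return mils * MM_PER_MIL

    print(round(mils_to_mm(6), 4))   # 0.1524 mm -- awkward to put on a fab drawing
    print(round(mils_to_mm(10), 4))  # 0.254 mm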


Also, what we have now came about through quite a bit of evolution...

The first thing that looks a bit like modern notation is probably plainchant, originating in the Catholic church circa the 14th century:

https://i.ytimg.com/vi/4qKRGsJMMG8/hqdefault.jpg

The basic system we use today originates from about the 1600s or so, but has still evolved a lot.

There were tons of historical warts along the way that have largely dropped off - for instance, figured bass notation (https://en.wikipedia.org/wiki/Figured_bass) or the French violin clef (https://upload.wikimedia.org/wikipedia/commons/thumb/c/c3/Fr...)


See also: https://news.ycombinator.com/item?id=12159224 . Which I shamelessly plug since I was a participant in that one. :-)

I got a reply there that the current system is only suitable for professional musicians, and that you'd need something like shape notes to reach mass musical literacy. Now I'm hopelessly biased as a music degree-holder, a semi-professional musician, and a Presbyterian to boot ;-), but this strikes me as setting the bar way too low. Given levels of overall literacy in the US (which were very different when shape notes were developed) I don't think it's that difficult to learn the notation itself – the difficulty I think is in mastering the music system.


Think of it as data compression that shows you the notes you're most likely to play, without taking up space for notes you probably won't.

If there's a piece in C, for example, in most traditional Western music you're unlikely to play off-key notes. So why take up valuable space for those when you can denote that unlikely event with a sharp/flat symbol?

Traditional music notation made no sense at all to me until I realized this.

Edit: For those that don't know, in most western music you're only going to use 8 out of the 12 possible notes most of the time. This is not universally true especially of modern non-pop music, but traditionally if you played off-key notes people thought you might summon evil spirits so it's easy to understand why things would be written down this way. Not only is it space efficient, but you wouldn't accidentally summon the devil. To summon the devil you have to really want to and write a flat or sharp in there.


> Edit: For those that don't know, in most western music you're only going to use 8 out of the 12 possible notes most of the time.

You mean 7 notes. Traditional music notation and terminology are confused in many ways, one of which is a fencepost-counting error. As a result: octaves are actually seven notes apart in a scale, two major seconds make a major third (2+2=3?), and two octaves make a fifteenth (8+8=15!?).
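
The fencepost quirk in code form (my own illustration): diatonic interval names count both endpoints, so composing two intervals means adding their numbers and subtracting the double-counted note.

    def compose_intervals(a, b):
        """Combine two diatonic interval names (2nd=2, 3rd=3, octave=8, ...).
        Each name counts both of its endpoints, so one note is double-counted."""
        return a + b - 1

    print(compose_intervals(2, 2))  # 3  -> two seconds make a third
    print(compose_intervals(5, 4))  # 8  -> a fifth plus a fourth make an octave
    print(compose_intervals(8, 8))  # 15 -> two octaves make a fifteenth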


This is a really great answer! This is also a big part of why different instruments read different clefs.


>Traditional music notation made no sense at all to me until I realized this.

Can't thank you enough because your characterization is the first one I've heard that adequately explains the foundation of the visual system as one of condensation. I've been shrugging my shoulders about this for many decades!


Agreed, but reading sheet music is a very small part of playing piano proficiently (say, at the 97th percentile). Once you get past knowing the notes in a piece, there's the much more difficult task of being able to manipulate the force that you exert from your fingers to create the right volume balance. For example, your untrained thumb will naturally play notes much louder than it should, and it takes a lot of practice to be able to play notes with it at the right volume; the opposite is true with your pinky and ring fingers.

...Not to mention the even more difficult task of knowing what you want the piece to sound like in the first place. A novice playing a piece at 100% accuracy sounds nothing like a concert pianist playing the piece. There's a world of depth to music beyond just learning the right notes.

Here's an example: listen to this performance of Debussy's "Reflets dans l'eau" by Arturo Benedetti Michelangeli, one of the greatest pianists of the 20th century:

https://www.youtube.com/watch?v=LLbpQl1cCl8

And then listen to this student play it (she is still a high-skill player, just not world-class talent):

https://youtu.be/l2gJSVOdaG8


I don't know a whole lot about concert piano, but I don't think you could have picked two better videos to illustrate your point. That student is obviously very practiced and skilled but there's just no comparison.


Thanks! When I learned this piece, I listened to that recording on repeat. Michelangeli is simply amazing.


The student asks, "how?". The master asks, "why?". That's why you feel a difference in the two performances.


> I always wondered why musicians keep up with the conventional musical notation system, and haven't come up with something better (maybe a job for a HNer?).

Before starting down that path, I would recommend familiarizing yourself with the wide range of music notations that already exist and continue to be used, and then the ridiculously varying plethora of failed alternative music notations that have been invented over the centuries, and why they failed to see wider adoption.

And, of course, it's fascinating to study the evolution of the existing "standard" music notation, and see the changes that have been adopted, and the ones that weren't. For all its apparent stasis, it has definitely evolved over the centuries, in response to the changing needs of musicians.


Agree 100% with all this. Modern drum notation is probably the easiest case to look at with regard to the evolution of "western notation", with jazz chord symbols being another.

Some other reasons why musical notation prevails:

- There's a huge switching cost, as much of the world's written music is in some form of "western notation". Being able to read standard notation unlocks a huge wealth of knowledge from books, etc.

- Standard notation is one of the most flexible ways to create readable music, playable and easy to read across a wide variety of instruments and ranges (clefs, transposing score, etc).

- It's a common language, in the way that a programming language is. Some of the conventions may be confusing to outsiders (e.g. why is the term "puts" used for printing in Ruby? This seems normal to any Ruby hacker but is completely unintuitive to a layperson). Once these conventions are learned, they provide a common reference point. Like a lot of languages, it's far from perfect, but much like spoken language, it's more likely to evolve than be replaced.

- There's almost no motivation for anyone to replace standard notation. Notation isn't required for all forms of music (many great jazz and blues musicians don't read music), and for the forms of music where it is required, it's by far the quickest and most efficient way to communicate the information.

In summary, I think the question of "why can't we do better" is valid, but you could ask the same question about programming in C. There are good reasons to write C in 2017, and there are still good reasons to write musical notation.


What great jazz musicians don't read music?


Wes Montgomery, Erroll Garner, Django Reinhardt and obviously Roland Kirk are probably the most well known that couldn't read at all. There are many, many more jazz musicians that were/are very poor sight readers.


Sure, but those guys are all (sadly) long gone, and the parent comment said "don't", not "didn't".


Bireli Lagrene, Scott Hamilton and George Benson have all said they don't really read music.

It's definitely true that most jazz musicians can read passably, but my original point was that it's an aural tradition: no one learns to play jazz by reading notes off a page, whereas in Western classical music it's an essential skill.


agree that probably all currently popular jazz musicians read music, but is this necessarily an improvement?

https://www.amazon.com/Erroll-Garner-One-Hear-Read/dp/B00AZ4...


That doesn't jive with the "I can do better" mantra around these parts. I say that both sincerely and sarcastically.


Jibe means to be in accordance. Jive is a dialect and a dance. FYI. I realize we will probably lose these words, but for now.....


I get your point, but at the same time I would say that it's often much harder to "do better" when you don't even have a clear idea what it is you're trying to improve upon.


They are not saying you cannot be better. They're just saying to respect what came before and learn from it as you devise the better scheme.



Closer to this: https://xkcd.com/1831/


I have played piano and guitar (piano for almost my entire life and guitar for several years) and have used both tablature and traditional sheet music.

Tablature is much easier to read initially as it provides a one-to-one mapping between the visual representation and the physical location of the notes - i.e. 5 frets along on this string. However, from my experience there is a cap on the 'bandwidth' at which you can sight read this. It is just too hard to mentally parse a bunch of numbers on lines and turn that into notes when playing at speed. (For non musicians, 'sight reading' means to read the notes and play fluently at the same time)

Traditional sheet music has a steeper learning curve; however, I've found that reading this music becomes much more subconscious with practice and the bandwidth at which you can parse the notes is much higher. Also, it is much easier to notice patterns in sheet music - e.g. a major 7th chord in the key of the song is visually obvious no matter what the key.


Great point.

To a first approximation:

Tablature is a _physical description_ of how a particular stringed instrument should be played, and the notes are a side effect of that. It is instrument specific and it doesn't contain much information about the musical details of the piece.

For example, tablature doesn't describe the key the piece is to be played in. To figure that out, you have to mentally translate the mechanical description into notes, and from there determine the key.

Standard notation is a _musical description_ of how a particular song should be played, and the physical act of playing is a side effect of that. It is not instrument specific, and it contains a lot of information about the musical details of the piece, but usually no information at all about how the instrument should be played. (There are a few minor exceptions.)

For example, standard notation tells you exactly the key the piece is in, but the player has to mentally translate the notes into the physical steps of getting that note out of the piece.

Basically standard notation adds a layer of indirection from the music to the mechanical act of playing. Like many indirections, it can be hard to understand at first, but it adds great power and flexibility that a direct system doesn't have.


What you're saying makes sense, but it applies the other way around too, in that tab can feel abstract while notation is very visual. For example, if you see a scale in musical notation, it's immediately obvious that it's a scale just from a 50-millisecond glance, whereas in tab it's not obvious that it's a scale until you read/play through it.

When you become adept with musical notation, this is one of the primary hindrances of tab.


Tablature is also needed because the same note can be played on different strings, notes can be doubled, there are different ways to transition from one note to another, and there are various other nuances that are messy at best to try to express in standard notation.


Another feature/side effect is that you can use any instrument to play any part of a written piece, as long as it's physically possible to play the notes as written. And if not, you can improvise easily by dropping superfluous or unnecessary components/notes without changing the overall sound of the music.


Another side effect is that as a guitar player you can look at the tuba player's part to figure out what he is playing even if you have no idea how to play tuba, if you read music. With tablature only a guitar player will know what you are playing.


> However, from my experience there is a cap on the 'bandwidth' at which you can sight read this. It is just too hard to mentally parse a bunch of numbers on lines and turn that into notes when playing at speed. (For non musicians, 'sight reading' means to read the notes and play fluently at the same time)

I've noticed this as well, and my team has developed a notation based on key/scale and a new user interface for the guitar so that experienced players and beginners can sight read on their first attempt at a new song.

We reduced the cognitive load of sight reading music. Not only that, we then backfill technique like chord fingering, introducing traditional chords one at a time. Here is a series of three videos of what I'm talking about: https://www.youtube.com/watch?v=KXpTGIzBONU&list=PLvoNIaPTga...


> However, from my experience there is a cap on the 'bandwidth' at which you can sight read this. It is just too hard to mentally parse a bunch of numbers on lines and turn that into notes when playing at speed. (For non musicians, 'sight reading' means to read the notes and play fluently at the same time)

Sorry but this is wrong IMO. You've been reading sheet music your entire life, but you've only been reading tab for the past few years.

I've been reading tab for 10 years. I think in tab. There are a bunch of songs that I can't be bothered learning (Sultans of Swing, Metallica songs + solos, Oasis songs... you get the idea) because I don't like them enough but are fun to play along with, and I do so with Guitar Pro playing the tab at full speed. It's basically like Rocksmith/Guitar Hero but in "real life" mode.


I started with tab and learned to read music 15 years later. I was amazed at the vast amount of information in sheet music.

Tab is great for messing around, beginners or simple songs. I can't even imagine trying to learn to play complex jazz or classical music using tab. Sheet music also guides you right into learning scales and intervals.

Tab is great for playing Guitar Hero but, even on a real guitar, it's like pressing buttons. It doesn't help you learn much at all. I'll never go back to using tab even though I can visualize it easily in my head.


I wonder if you had been a guitar + tablature player your entire life and picked up piano a few years ago, if you would come to the same conclusion.

I've tried learning guitar a few times, and when I've asked accomplished players how they get by with tabs, it's been explained to me that tab establishes a minimal framework that you play within. It's a lossy compression scheme (and traditional sheet music is less lossy). Would you agree with that?


>I wonder if you had been a guitar + tablature player your entire life and picked up piano a few years ago, if you would come to the same conclusion.

I was a tab-reading guitar player for years, then learned classical notation. Classical notation is undoubtedly faster to parse. For whatever reason, there seems to be a much more direct connection between your eyes and hands when you're reading dots.

It seems to be much more amenable to chunking[1] - you stop seeing individual notes and start seeing chords and scale fragments. Tab is a meaningful and direct representation of the physical parameters of the guitar fretboard, which I think is a shortcoming; classical notation represents information in a way that more directly corresponds with musical theory.

Tab is lossy, but it discards some very important information. Unlike classical notation, it has no native means of indicating note length and can't accurately represent rhythmic subdivisions. If a piece of music has any real rhythmic complexity, tab alone is insufficient.

[1]https://en.wikipedia.org/wiki/Chunking_(psychology)


I think most guitarists (including me) use tablature as a loose framework to extemporize around rather than as an exact transcription. You can find lots of youtube videos of people playing exact versions of old favorites (Stairway to Heaven, for example) but they are usually the musical equivalent of painting by numbers, lacking feel.

Jazz musicians typically learn the changes (chords and melody line) to tunes and improvise around that from a sophisticated understanding of harmony, a variation on the tab approach.

Sight reading music, especially for guitarists, is more akin to tightrope walking in my opinion, but typically a combination of tablature, staves and chord changes gets me to where I need to be.


Totally disagree.

Tab is LESS lossy than traditional sheet music because it encodes the string as well as the pitch.

A given note could be played in as many as 5 different places, and they will ALL sound different. An open A (5th string) will sound different than the same A played on the low E string, 5th fret.

(This is completely unrelated to the woeful quality of most of the tab floating around on the net. You can write down a piss-poor transcription as sheet music too.)


Tab is terrible at conveying rhythmic information; playing anything moderately complex is very hard unless you're already familiar with the material. And I'd say it's a very lossy format if it's reliant on out-of-band information like a recording to make sense.

Fingering is a problem that mostly goes away as you gain an innate sense of what sounds good versus economy of movement and the ability to mute. For music written on guitar, it's usually relatively easy to tell what position works best.

Every guitar is different, too. String gauges, pickups, resonant notes, action height and intonation all play into it, and most of those are subject to personal preferences.


> Tab is LESS lossy than traditional sheet music because it encodes the string as well as the pitch.

But it doesn't encode note durations, right? All the tab books I've bought don't differentiate between whole notes, quarter notes, etc... So that seems pretty lossy. Look at any guitar fake book for an example.

Plus, I never looked at tablature as a literal transcription. That's why I would describe it as a more of a framework. Like you say, a note can be played in a lot of different places. Once you internalize the fretboard logic, when you see an A in the tab, you play the one you think will sound right or is physically accessible.


A lot of the nicer tablature is in a hybrid format that borrows symbols from standard notation, like attaching stems and flags and dots to notes as appropriate to make the rhythm explicit.


Voicings can be up to the conductor; the lossiness is a feature, not a bug.


Music notation does take time to learn to read well, but it's no different from anything else that takes time to learn and master.

Once you get past a cursory "eff this" reaction, you start to see how downright brilliant notation is.

The vast majority of music focuses on 7 notes at a time. If you alter a key signature, you are playing 7 other (non-distinct) notes. Music notation encapsulates this concept very well.

That's only one example, but telling musicians their notation sucks and needs to be fixed because it's hard for a non-musician is akin to a musician telling a programmer that Python and Linux need to be fixed because they don't look like a violin.


> Once you get past a cursory "eff this" reaction, you start to see how downright brilliant notation is.

In this it reminds me of vim.


Over the last 800 years, hundreds of different systems have been proposed as alternatives to the one that has evolved into the system in use today.

Generally it can be said that some have been better in a specific use case (klavar notation was pretty big in the Netherlands among those who didn't know regular notation), but they fall apart pretty quickly when you try to write Liszt or Rachmaninov in it.

I might be a bit rigid (I have played bassoon professionally for most of my adult life), but I can't really see how it can be made much better and still keep the same utility.

While chords might not be optimal today, we can still express things like enharmonics easily (which, at least for me, is something that can make sight reading easier as it allows the notes to stay "in key").

As with the spoken word, music has an advanced coding system. Both coding systems are flawed in their own way (as someone with a different mother tongue than English, I have a hard time spelling just about anything), but they have also stood the test of time.


Well, considering this article on Ableton never even uses conventional musical notation and many working musicians sit in front of their DAW all day, I think it's safe to say that the virtual piano roll has largely taken over the role of classic notation. I don't think many people making rock music, EDM, or hip-hop have really touched classic notation ever.


I'm a classically trained pianist, and I basically agree with you. I read sheet music because that's what there is. As a young composer, I wrote in standard notation because I hadn't questioned it. As an old composer, I don't think there's anything great about it, and I have no need of it. I 'write' all my music on hardware or software synths. It's much easier. All I care about are the parameters. What note, when, how long, how loud, etc.


It's clear that current music software is poor for conveying information when compared to editors for many other tasks. It is easy to blame musical notation for that, but in fact in most music software there are several equivalent views (tracks, piano roll, notation), and you'll find that notation is the _most_ efficient of those.

Consider this: in this system, your most complex Classical scores for an entire orchestra are written, and present-day trained composers continue to work efficiently in it. That tells you about its expressive power. It is in fact not stupid, but very well tuned to a lot of music theory. Other than complex timbre manipulation (and even that), you can probably do everything you want to accomplish with just software that does nothing but notation.

Instead, what most music software lacks is in the organization department. The organization of non-linear ideas, their programmatic (as in music) occurrence, the automation of repetitive tasks, and the completion of obvious intent. Tracks and loops are probably not the right view of musical structure, at least far from a _complete_ view. There needs to be a better bridge between musical phrases and ideas at the local level (for which musical notation is perfectly suited) and the organizational structure of a complex piece at the macro level (for which tools are very lacking). There also needs to be a better bridge between some conception of events (for which musical notation is slightly ill suited, being restricted to notes) and the microscopic world of timbres, effects, and transformations.

Until music software makers recognize that what they should be helping with is neither engraving, nor mixing console simulation, but a non-linear creative task, music software will continue to suck.


Speaking as a classical pianist, I think the conventional musical notation system is actually pretty good. My only issue is having to memorize Italian, French, and German phrases to be able to read music properly. IMHO, music notation should be localized.


Many non classical musicians use other notations. For example, many guitarists and bassists use tab notation. It's simply a visual representation of the strings and a number for which fret to play.

It's not as expressive but is far easier to get started with.


MOST guitar players I know use tabs. Personally, I'd rather see the chord type and root string, e.g. 6th-string Amin, 4th-string Gmaj7. Tabs are almost as confusing because I memorized the notes, not the fret numbers.


I memorize chord progressions in terms of scale-degree numbers. E.g. {F#m; B dom7; E Maj} becomes {ii; V7; I}. I've memorized which degrees are major chords and which are minor chords. So if I can figure out the tonic, then I can figure out the key. And if I figure out the key, I need only remember a pattern of ordinals (modulo non-triads).
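
A toy sketch of that mapping (the key and chord names are just the example above; chord quality, i.e. lowercase ii vs. uppercase II, is left out): given a key, each diatonic chord root reduces to a scale degree.

    # Map chord roots to Roman-numeral degrees within a major key.
    # (Chord quality -- lowercase ii vs. uppercase II -- is not handled here.)
    NOTES = ["C", "C#", "D", "D#", "E", "F", "F#", "G", "G#", "A", "A#", "B"]
    MAJOR_STEPS = [0, 2, 4, 5, 7, 9, 11]           # semitones of the major scale
    NUMERALS = ["I", "II", "III", "IV", "V", "VI", "VII"]

    def degree(key, root):
        """Roman numeral of a diatonic chord root relative to a major key."""
        offset = (NOTES.index(root) - NOTES.index(key)) % 12
        return NUMERALS[MAJOR_STEPS.index(offset)]

    print([degree("E", r) for r in ["F#", "B", "E"]])   # ['II', 'V', 'I']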


This is a great illustration of how we grapple with the abstraction of scales and key over time.

First, you have tabs, which describe the physical position of the notes on the instrument.

Then, we have root / chord type notation, in which we describe the starting position and shape of the notes on the instrument, and the musician must translate that information to the physical position of the notes, on the fly.

What is important about this second stage is that the musician has a pretty good grasp on how to play, and can usually sight read a piece and get a pretty decent version of it just by tracking chords, or, in the case of the piano, chords in one hand and the melody in the other, or a small pattern.

Finally, we come to roman numeral notation, which describes the chords based on their position relative to the root note of the key, not the chord. This is a powerful abstraction. It provides incredible insight into the relationships between music, notes, chords, and progressions of chords at a level divorced from the 'root' of that key. A 9th played over a minor 7th chord is going to give you a very similar sound in any key. This is a great skill for songwriters and composers, who need to have a strong working intuition about things like what chord will sound good in this progression, or what notes we want to appear in our melody (which is related to the chords beneath it).


Yes, thank you. This is particularly frustrating when you play with an alternate tuning such as DADGAD. Tabs are pretty much useless then.


"It's not as expressive"

Have you ever used Guitar Pro or Tux Guitar? It can be INSANELY expressive. Grab a MIDI of Van Halen's "Jump" (IIRC The best one was about 76kB) and import it into either of those. Guitar Pro will be noticeably more expressive vs TuxGuitar. Inside of that MIDI, the solo is 100% dead-on note-for-harmonic-for-slide-for-hammer. Both programs output the exact same tablature. You will get the solo perfect.

Most people that have read tablature haven't read the guitar-specialized notation found in Guitar Pro or TuxGuitar. It's far more instructive.


MIDI isn't tab though. It requires note velocities and durations for a start, which tab doesn't. You can go from MIDI to tab but you couldn't go from tab to MIDI.


"MIDI isn't tab though. It requires note velocities and durations for a start, which tab doesn't."

This is entirely incorrect. You can get velocities (mezzo-forte, mezzo-piano, etc.), and they are shown if you hover over the note itself in Guitar Pro or TuxGuitar. Sure, they change the granularity of it, but the general range remains the same and for all practical purposes it sounds the same if played properly.


There have been dozens of suggestions for alternative notation systems over centuries. Many documented here: http://musicnotation.org/

I guess it's just inertia.


There are a lot of books on the subject, but the short answer is that it's just the most common standard at this point. Yes there are some things about it that don't make sense, but for whatever reason it seems to be the most coherent way for musicians to communicate using a common language.

Its popularity also has to do with what sounds pleasing to the ear (and brain) on a biological level.

A number of people have come up with alternative scales and notation systems over the years, but none of them have really stuck for one reason or another. Nonetheless, they are pretty fun to read about.

Here's the whole history of notation: https://en.wikipedia.org/wiki/Musical_notation

Also, if you aren't familiar with John Cage, you should check him out. His music and writing deals with a lot of the stuff you just brought up, and it's also a really great jumping off point to find other interesting artists and musicians.

Indeterminacy, a work he did with David Tudor is a great starting point https://www.youtube.com/watch?v=_lOMHUrgM_s


I once thought the same thing, but after months of studying our music system and our way of notating it, I came to understand why it's so difficult to improve upon staff notation.

First of all, Western music has complex structure both horizontally and vertically. This makes it rather difficult to encode and visualize, right at the outset. You need some sort of matrix visualization, like a staff or piano roll, to capture all of the nuance.

What makes the staff so useful is that it also captures the tonal aspects of music in a compact way -- those that relate to the key the music is written in. Every triad in the same inversion looks the same in every key. A triad is three consecutive lines or spaces. And then deviations from the standard triad for that tonal function are marked with accidentals.

This turns out to be extremely useful for performers, because you learn to play an instrument by learning to play in all the keys, rather than learning what the 12 notes are and playing note by note. I realized this when taking piano class and doing exercises where we'd transpose to another key while sightreading in the original key.

There are other notation systems that have been successful, but they tend to be specific to particular instruments or styles. For example, most guitarists find tablature much easier to play from than standard notation, especially if the tablature is augmented with note durations and rests.

Also, although I've become a true believer when it comes to the staff, I have less rationale for why the traditional clef system has stuck around. It seems like something that is more regular as you go up and down the scale would be more helpful. There are systems that use things like note shapes or colors to help mark the note name. I guess we just haven't found a standard.


I'm a programmer/musician and I can read on guitar and piano. I'm somewhere on the middle on this debate. I really dislike conventional notation but I also agree that the alternatives have some big downsides as well.

My biggest objection to conventional notation is that it gives a profoundly misleading picture of how music and harmony really work. It defines one reference key (C Major/A Minor) with a certain pattern of steps and gaps, starting on a certain note. Then for all of the other keys you add more and more sharps or flats until you get into ridiculous keys where all 7 notes are modified. The truth is that there's just one evenly spaced set of 12 tones, and all it means to be "in a key" is that you've picked a certain note out of the 12 to start the pattern on. There's nothing special about C. We could have chosen the key we call F# as the reference key and named it C, and everything would work the same.
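
A quick illustration of that point (my own toy code): the same step pattern started on any of the 12 pitches gives a major scale; nothing about C is special.

    # Twelve equally spaced pitch classes; a major key is just the same
    # whole/half-step pattern started on a different one of them.
    NOTES = ["C", "C#", "D", "D#", "E", "F", "F#", "G", "G#", "A", "A#", "B"]
    MAJOR_PATTERN = [2, 2, 1, 2, 2, 2]   # semitone steps between the 7 scale notes

    def major_scale(root):
        idx = NOTES.index(root)
        scale = [root]
        for step in MAJOR_PATTERN:
            idx = (idx + step) % 12
            scale.append(NOTES[idx])
        return scale

    print(major_scale("C"))   # ['C', 'D', 'E', 'F', 'G', 'A', 'B']
    print(major_scale("F#"))  # ['F#', 'G#', 'A#', 'B', 'C#', 'D#', 'F']  (the F is really an E#)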

It's hard to overstate the damage from this. Lots of musicians I know—serious players, people who took music in college—still think of "complicated keys" and "easy keys" and are only vaguely aware that the keys are actually all the same and they're just being tormented by the notation and terminology. I'm teaching guitar to a friend who was first trombone in high school and it blows her mind that she can play the same scales starting anywhere up and down the fretboard and it sounds the same.

It all comes from the design of the keyboard, where the notes of C major are evenly spaced (white keys) and the sharps/flats are stuck in between. There's also the fact that in the past the 12 notes weren't evenly spaced, so the different keys really did all sound different back then.

Conventional notation does have one big advantage, though: every line or space represents one note in the scale. This is more how musicians think: you don't care that much about the notes outside your key, and having the other ones "tucked away" in between makes it easy to see what's going on. That's why it's so quick to read once you know it. Out of the hundreds of alternative notations, I haven't seen one that's both key-neutral and also makes it easy to see things in terms of scale degrees.

(One idea I've had is a 12-tone staff with Sacred Harp-style shaped note heads to show you what scale degree you're playing. Not sure if that's ever been tried.)


I agree with your main point -- standard notation is basically just piano tablature and it tends to confuse as much as it enlightens about how music works. However, I disagree about the "there's just one evenly spaced set of 12 tones" bit. This is a simplifying assumption of standard notation that makes it hard to express the idea of notes that are outside of the well-known 12.

Even in the key of C major, this is a problem in just intonation. Say you want to play a G major chord, so it's made up of G, B, and D (3/2, 15/8, and 9/8). Later in the song you want to play a D minor, so you play D, F, and A (9/8, 4/3, and 5/3). That doesn't sound right, though. It turns out that the D you want is actually 10/9, which is just a bit flatter than 9/8. In standard notation, you can't distinguish.

It's possible to get around this by adding non-standard modifiers to notes aside from the usual ones (sharp/flat/natural), but unmodified standard notation misleads people into thinking that those two notes are the same. Which is another example of your main point, that "standard notation gives a profoundly misleading picture of how music and harmony really work".
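
A quick sanity check of those ratios with Python's fractions module (my own illustration): the D-to-A fifth only comes out pure if D is taken as 10/9.

    from fractions import Fraction

    D_high = Fraction(9, 8)   # the D that fits the G major chord
    D_low = Fraction(10, 9)   # the slightly flatter D that the D minor chord wants
    A = Fraction(5, 3)

    # A just perfect fifth should be exactly 3/2.
    print(A / D_high)         # 40/27 -- a sour, "wolf"-ish fifth
    print(A / D_low)          # 3/2   -- pure
    print(D_high / D_low)     # 81/80 -- the syntonic comma separating the two Ds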


I agree that intonation matters a lot; I guess I'm thinking about how you'd make a notation that better conveys the information already there in the current scores, which is 100% equal tempered. I actually like the idea of modifiers for microtones, and presumably any of that stuff would work just as well on a 12-tone staff.

Also, with a 12-tone staff plus shape notes, you'd get a little extra information for just intonation because you can tell for sure what key was intended for a given note.


DAWs don't use the classical notation system, neither does this tutorial, so I don't understand the context of your comment.


Well, technically some DAWs have a notation view (Logic and Sonar, for example), but it's pretty much useless.


Since music notation is a form of communication, wide adoption is a huge factor in what is considered better.

We could come up with more precise and effective languages than the ones we naturally speak, as well, but the good-enoughness of the ones we already have and the fact that others around us are very likely familiar with them is more important. Utility trumps quality, and worse is better.

That said, if all you want is a different notation system to use personally or with small groups of other proponents, there are plenty to choose from. ABC and MML variants use letters for notes and numbers for note lengths, for example. Probably not optimal for sight reading, but maybe better than staff notation when writing or transcribing music. There are also trackers and piano rolls. Neither is very good for quick comprehension, but they may lay things out in a way that makes more intuitive sense.


One advantage of the 5-Line Staff is use of both lines and spaces. It's compact, easy to print, and easy to stack notes vertically.

Another advantage: each note of a diatonic scale is mapped injectively. Cf. representing each line (or space) as a whole-tone, which leads to hash-collisions (e.g. "is that a G or a G#?"). Each note on a line (or space) on which collisions occur would need an accidental. Which defeats the purpose of key signatures.

A diatonic scale contains an odd number of unique notes. The fact that C lies on a line while C' lies on a space is an unfortunate artifact of representing a 7-note scale with alternating lines and spaces.


Shameless plug :) Lightspeed, the sight-reading flashcard game.

http://buzzcola.github.io/lightspeed-music/

Requires Windows and a MIDI keyboard.


> I always wondered why musicians keep up with the conventional musical notation system, and haven't come up with something better (maybe a job for a HNer?).

Is this supposed to be satire? Invoking Poe's Law on this one


> I always wondered why musicians keep up with the conventional musical notation system, and haven't come up with something better (maybe a job for a HNer?).

Me too. But when you think about it, all you really need is a graphical representation that describes the pitch of sounds relative to each other as well as their duration relative to the beat. And conventional notation is not bad at it!

The current system is essentially:

a dot on a coordinate system representing the pitch, duration, and position of the sound in a sequence of sounds.

- a horizontal position axis: you draw an invisible x-axis representing the position of the note in its ordered sequence. It gives no indication of its duration.

- a vertical pitch axis defined by western notes (do, re, mi, etc): You draw your pitch lines, y-axis with y=Do, y=Re, y=Mi etc.

- a duration axis (let's say it points towards you): we can't draw it in a 2D representation of music, so we project this coordinate onto the time-pitch plane, which is your staff. We decorate the dot representing the note according to its duration coordinate: say its duration is half a beat, then the dot is a black filled circle; if it's a full beat, then it's a white circle; if it's a 4th of a beat, then it's a black filled circle with a hook. Etc., etc.

And then you start adding all the extras of music notation: rests (blanks for half beats), vibrato, tempo, etc.

Now there is this choice of not representing note position and duration on a single axis. That may very well be so it's easier to standardise and read. You could also choose to represent the duration coordinate with colour; would that make it easier? :)

Maybe the problem doesn't come from the notation, but from the system itself: the half step between B and C, the 12 notes (but really it's more), etc. That's why solfeggio is hard! I think some Greeks considered the study of harmony to be at least as intellectual as that of counting! I wonder if there's an algebra for harmony. An H-algebra, why not?

But really, it's not the only notation: guitar tabs, guitar chord diagrams, etc.


Tempered tuning indeed divides the octave into 12 half-steps, but a huge amount of music uses only 7 or fewer of them for long stretches (or entire pieces) with a few exceptions. So think of the lines & spaces as being a compressed representation that doesn't waste vertical space for the tones that a piece isn't going to use.

Me, I love standard notation. Common chord voicings and interval patterns stand out as easily recognizable patterns on the page.


I wonder how many of us "skilled musical technicians" there are - people who can read music really well, produce those notes on our instruments predictably enough to play in a group, but just aren't that "musical" - we're boring to listen to on our own and have trouble singing. I'm a competent flute player, but it's a good thing I was just as interested in computers as a teen.


They actually discuss that in this course. They talk about pelog scales and 19-note divisions of the octave (not 12): https://learningmusic.ableton.com/advanced-topics/pelog.html



Roman numerals may seem simpler than Arabic numerals, but it turns out Arabic numerals are more convenient for complex operations like multiplication.


I think that you are approaching the notation from a very left brained kind of logical point of view. Once you learn to recognize the patterns in musical notation, none of these concerns actually matter. Musicians just see the pattern and play it, and then focus on the stuff that is really difficult, which is the musicality.


piano rolls are used instead in production. (source: I make lots of music - http://www.soundcloud.com/decklyn)


It is confusing at first, but once you memorize where all the notes are it is very good. Notation is based around the idea of key signatures, and once you have that down it becomes very intuitive and you can actually know what a piece of music sounds like just by looking at the notation. Western music has 12 distinct pitch classes, but typically the notes are used in scale groupings of usually 7 notes, with accidental "outside" notes being easily recognized by sharp and flat symbols. Doing it that way gives easy visual cues for musical "events" such as key changes, outside chords.. etc. There is a reason it has stuck around, it is a quite ingenious system.


I have mild dyslexia myself and I think any kind of notation is going to be a problem for us. The good news is you don't need musical notation to play music. You can play by ear. Don't let it stop you if you're really interested in music.


Guys if you haven't seen Sonic PI (http://sonic-pi.net/), this is also a great tool! You can write beats using a Ruby DSL and it runs them real time.

I sat down and did this in an hour: https://github.com/exabrial/sonic-pi-beats/blob/master/house...

Sam Aaron is the guy behind the project, he does a lot of ambient type stuff: https://www.youtube.com/watch?v=G1m0aX9Lpts


I wanted to come and post this. Sonic Pi is an amazing tool with a compelling raison d'etre that I would think resonates with the HN community. It's a realtime code as performance tool aimed at teaching kids programming but is also used by advanced users to create wonderful pieces of music.


I'm actually working full time on a new DAW that should make writing music a lot faster and easier. Current DAWs don't really understand music. Also the note input process and experimentation is extremely time consuming and the DAW never helps. Current DAW : my thing = Windows Notepad : IDE. The HN audience is definitely one of my core groups.

If you are interested, sign up here https://docs.google.com/forms/d/1-aQzVbkbGwv2BMQsvuoneOUPgyr... and I'll contact you when it's released.


I've also made a few iOS apps for the purpose of simplifying composition, though they're pretty limited in scope (on purpose). Although it seems most composers would prefer to use full DAWs from the start, I'm personally much more creative when I'm able to jot down and edit my fragmented musical ideas as quickly as possible, if only to make the initial draft. (If I were a better singer or musician, I'd just use a recorder or a looper — but my skills aren't quite there yet, and besides, it's hard to note-edit a recording.) Composer's Sketchpad[1] lets you paint notes directly onto a time/pitch canvas, bending and stretching them as they go along. (This works great for e.g. guitar solos.) MusicMessages![2] is a more basic piano roll that lets users quickly tap buttons across several layers to enter notes. (Musical bubble wrap! Works great for riffing on short drum sequences and chord progressions.)

There's another similar-sounding project called Helio that was posted a few weeks ago: https://news.ycombinator.com/item?id=14212054

I hope that in time, we get more Markdown-style composition tools vs. the full DAW suite. Good luck! I'm looking forward to seeing what you make.

P.S. AudioKit is pretty dope. :)

[1]: http://composerssketchpad.com

[2]: http://musicmessages.io -- working on turning it into a full iOS app, so will probably have to shut it down and fold it into the new app at some point


Sounds really cool! Actually, I'm close to releasing a (looper) DAW -- kinda geared towards live use, but I've thought a lot about composition too.

https://zenaud.io

Send me an email at mpercossi at zenaud.io , always fun to talk to fellow audio devs :)

And for all the vim lovers out there -- my app supports vi commands for movement and editing :)


That looks really nice... too bad there's no Linux support :/


Yes, it is a shame. But: I will add it, along with Windows support.

Indeed, I'll go further. I'm really starting to believe that the only way not to get royally screwed as an app developer is by abandoning the "major" platforms -- which all want to turn you into a serf -- and target OSS platforms like Linux. I'm honestly tiring of dealing with the artificial roadblocks Apple (and Microsoft is no better) throw at me to further their own ends. I actually analysed SteamOS with this intent, but sadly it looks like SteamOS is geared towards the "living room" experience.

Anyway, long story short: there will be Linux support in 2018.


Does it have per note editing? For example in trackers you can specifically set a note to play volume X, pan Y, pitch Z


zenAud.io is designed for live use, so it currently doesn't have a piano roll -- instead, you define record loops using editing tools in the arrangement view and record MIDI or audio into it. You can also drag and drop to import standard MIDI files into the arrangement if you want to use pre-written stuff.

I realize this is a big limitation, but we intend to add a piano roll in the next few months.


Well you're surely over-promising, here's hoping you won't under-deliver. Do you have anything at all to show yet?


What do you think is an over promise? The resulting app will be less than 10KLOC, discounting third party libs.

I don't have a demo yet if that's what you are asking about but I've open sourced this for example

https://github.com/adamnemecek/WebMIDIKit

Actually I do have some old demos but they don't show the best parts. It's actually kind of hard to show those right now.


>a new DAW that should make writing music a lot faster and easier.

>Current DAWs don't really understand music.

>Current DAW : my thing = Windows Notepad : IDE.

It really sounds like you're promising a lot.


Have you seen e.g. synfire?


I'd be really interested to hear the concept of how you are making things more IDE-esque


I think my idea of a perfect music program is closer to vim than an IDE, but you're on the right track.


Have a look at extempore, a lispy live music/notation language and environment. Only emacs bindings, no vim, but impressive performance nevertheless...

[0]: http://extempore.moso.com.au/

[1]: https://github.com/digego/extempore


I played around with this a while back and there are Vim plugins for it. My biggest problem was having to compile the thing from source which involved also compiling a custom version of LLVM, which took forever. It's possible this is no longer a problem.


I'm aware of extempore; it's impressive. I actually use SuperCollider quite a lot (in vim), which is not lispy (more OO) but in a similar space. But what I want is something that can operate on music how vim operates on text, not just operating on music-written-as-text in vim!


IMO, classical notation is the Vim of music - it looks bizarre to outsiders, it's totally unintuitive, it requires a lot of memorisation and practice to use effectively, but it's extraordinarily efficient in the hands of an expert user.


How is it efficient? What exactly is the alternative?


It's very quick to read and write. It contains all of the vital musical information in a very concise format. Numerous alternative schemes for musical notation have been tried, but none have achieved significant adoption.


Use vim to compose an abc file, then play it with software listed at http://abcnotation.com or https://en.wikipedia.org/wiki/ABC_notation
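
For anyone who hasn't seen it: an abc file is just plain text, which is what makes it so vim-friendly. A minimal made-up example (not from the site) looks like this:

    X:1
    T:Example scale
    M:4/4
    L:1/4
    K:C
    C D E F | G A B c |

Uppercase letters sit in the octave starting at middle C, lowercase ones an octave higher, and the L: header sets the default note length.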


Have you looked at Lilypond? http://lilypond.org/


I have. "LilyPond is a music engraving program, devoted to producing the highest-quality sheet music possible. "

I don't need or want any of that. In fact when I write music, music engraving is the least of my concerns. Actually music engraving is generally the least of my concerns period.

Also I find the current music notation to be kinda idk outdated. I can read it, but I feel like it's a system designed by someone who had the mathematical knowledge of a 15th century farmer (which is probably how it came to be).


>Also I find the current music notation to be kinda idk outdated. I can read it, but I feel like it's a system designed by someone who had the mathematical knowledge of a 15th century farmer (which is probably how it came to be).

What specifically about it?

I can read (and prefer) standard musical notation, but when handwriting I use Hummingbird [0] because I find it lends itself to handwriting. But I can't really imagine a "better" musical notation than what is standard today, except a better way to communicate natural/flat/sharp notes.

[0] http://www.hummingbirdnotation.com/


This is a long discussion. But fundamentally music notation is very paper oriented and doesn't exploit the advantages screens offer.

> (and prefer) standard musical notation

Prefer it over what?


I wanted to bring up Hummingbird notation with a specific context in which I prefer it (handwriting) while still being clear that I prefer standard notation over Hummmingbird as a whole.


There's a lot of vim too :-).


Great. To be specific, what I dream of is something that can operate on higher-level musical constructs analogous to vim's text objects (words, lines, parentheses, etc): chords, scales, rhythms, melodies? I don't really know exactly what this would look like, but I suspect it could be very slick if someone got it right. I had some ideas and thought about implementing them a while ago, but it got deprioritised next to making my own music and had to get to the back of the "some day I'll..." queue.

I've signed up to your google form, so I'll look forward to seeing what, if anything, you come up with :) I am on linux (and yes, I agree that music on linux is a pain), so I might not get to use it unless you port it, but I still look forward to seeing it, whatever it is.


Yes, you get it! That's exactly what this is. You sound exactly like me lol. I love higher order things and I've been chasing this "mirage" since I was like 12 but I never had the chops and time to really devote to this. Do you think that we could chat sometime? My email is my username at gmail.


What don't current DAWS understand about music?


Same thing that pencils don't understand about writing, and paintbrushes don't understand about art.


I'd argue that current DAWs expect the user to understand at least something about music. Sounds like OP is working on some "syntax-aware" features for their DAW.


Also, if this DAW understands something about music, will it constrain me to its understanding about music?

Most of what I write is highly dissonant or straight up microtonal.


> will it constrain me to its understanding about music

This is actually exactly what I'm trying to prevent. Most of the current solutions only kind of constrain you to a certain tonal space that you can maybe explore but the space of possible compositions is actually insanely large. My DAW is going to try to help you explore all that.

Microtonality is definitely something I've thought about and I think I can make it work, but I'm curious to know what you currently use to compose?


I lean towards Reaper the most as far as DAWs go.

Often I'll use http://www.huygens-fokker.org/scala/ and my synths and a fair bit of SuperCollider/Overtone.


Isn't knowing something about music something of a prerequisite for someone who wants to make music? Of course everyone has to start somewhere, but as a musician of 20 years who loves DAWs, I would say learn an instrument first, or at least concurrently, if you want to start producing music.


The thing is that once you learn the music theory, few DAWs let you leverage that to be more productive.

Also why do you have to learn music theory first, why can't the DAW teach you as you go?


Mostly they don't understand all the possibilities outside of typical meter and tuning systems very well. They can do some of it, but they tend to push you toward writing 4-beat meters in 12-note equal temperament. Rhythm and pitch both have far, far more possibilities, which DAWs either ignore or at best make second-class options you have to fight for, or offer only in a few limited flavors.


The better question is: what do they understand about music?


Well can you tell us about that? You're the one who made the claim.


Check out Synfire; it's the only software that is somewhat similar. But it's very expensive, and the UI isn't great (sometimes it feels like writing music in Excel: it can provide "intelligence", but you have to tick checkboxes and click on things, and ain't nobody got time for that). Some people might find it interesting that it's written in Smalltalk, though.


Can you explain what a DAW is?


A DAW, or Digital Audio Workstation, is to building music what Final Cut Pro is to building video, or what Eclipse is to developing software. Most DAWs consist of multiple tracks which hold multiple audio clips, each of which is scheduled to play at a certain time. You build a song out of these clips, which you have loaded into the DAW. You can also add various effects to the clips and manipulate them. You can store abstract music event data ("play note A here at this time, then play note B flat at this time") in additional tracks. This data has no sound associated with it, but like a player piano roll, you can set it up to play notes on some instrument, either an internal software instrument provided by the DAW or a third party, or emitted via MIDI to a remote hardware synthesizer.
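
That "abstract music event data" is usually just a timed list of note messages (MIDI under the hood). As a very rough sketch of the idea, not any particular DAW's actual format:

    # (start_beat, midi_note, duration_beats, velocity)
    clip = [
        (0.0, 57, 1.0, 100),  # A
        (1.0, 58, 1.0, 100),  # B flat
        (2.0, 57, 2.0, 80),
    ]

    def schedule(clip, bpm=120):
        """Convert beat positions into seconds for playback at a given tempo."""
        sec_per_beat = 60.0 / bpm
        return [(start * sec_per_beat, note, dur * sec_per_beat, vel)
                for start, note, dur, vel in clip]

A DAW's piano roll is essentially a graphical editor for lists like that, with an instrument deciding what each note number actually sounds like.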

DAWs are used to produce the huge majority of music you hear in the media, from commercials to hip hop songs. Even seemingly real orchestral pieces for movies are often composed entirely using artificial instruments. For example, here is Junkie XL showing how he composed themes for Mad Max Fury Road.

https://www.youtube.com/watch?v=VkNeXS0Lmxc


Digital Audio Workstation. Think Ableton Live, Apple Logic Pro, Avid's Pro Tools, FL Studio, Cakewalk Sonar, Propellerheads Reason... tools to record/arrange/produce/master music.


They are all different beasts; once you learn one you don't want to relearn another.

The only difference is that Ableton Live and Bitwig (which runs on Linux) are designed for live performance.

I like Reaper (it costs about a fifth as much but is equally capable) and it also runs reasonably well under Linux. https://linuxmusicians.com/viewtopic.php?t=15280

Actually, many people never pay for a license; it has a similar model to Sublime Text.


>The only difference is that Ableton Live and Bitwig (which runs on Linux) are designed for live performance.

Ableton, at least, also functions perfectly fine in the traditional piano-roll and timeline paradigm of DAW workflow too. Don't let the 'Live' part of the name mislead you into thinking it's only for live performers; it does everything the 'old DAWs' do, AND it's got great features to assist in live performance.

Also in terms of underlying concepts, if you know one DAW well, you can usually learn another one fairly quickly, as it becomes more a question of learning the interface more than anything else.


> if you know one DAW well, you can usually learn another one fairly quickly

I couldn't disagree more, but I am talking about doing professional work. The concepts are all the same, but getting to the point where you are proficient in a DAW takes a very long time; you have to find the quirks and strangeness that each one comes with before you can produce a quality piece.

Video editors are a hundred times harder to switch between.


Ableton is probably the best DAW right now simply because it has the most tutorials online.

I remember way back when I used Cubase. Couldn't find any decent help online.

With Ableton, you are spoiled for choice when it comes to tutorials and lessons.


Quick google: Digital Audio Workstation (DAW)

Many comments here mostly mention software. But there are some interesting exceptions. Check Surgeon for example, who likes to use his custom controllers with Ableton. You actually can see him re-wire the controllers every now and then. (Great music too ;))

https://youtu.be/KM558N6PJmY


Will this be for electronic music?


Of course! I'm working on this because I wanted to make some electronic music, but none of the current DAWs really let me express myself the way I want.


Cool, I added myself to your list. What would you say makes your interface different than others?


It's gonna be clean and fast, no clutter. Recently there was this on HN https://github.com/peterrudenko/helio-workstation which kind of scared me because my UI is somewhat similar (but after thinking about it more, I actually find mine a lot better). Also note that the UI is only like 20% of the whole thing; the thing that I'm really trying to improve is the workflows. I will make Hypersphere the fastest DAW in the world when it comes to expressing your ideas.

When I'm in the zone, I don't care about check boxes. I have some new user interface paradigms that I haven't seen done before (I can imagine they have been tried before tho) that should make writing music super painless and should let you express yourself.


Good answer, thanks! I look forward to seeing more and hopefully watching it blossom as it grows. Cheers!


When you say HN audience would be one of your target groups, do you mean that your DAW would be more like a development environment/programming language (like Sonic Pi), or would it have a more traditional interface?


Both actually :-). And those aren't even all of the "composition paradigms" and they are all first class citizens in the UI.


Awesome, I'm stoked to get my hands on this!


Sounds intriguing. Any chance it would work under Linux?


I kinda wish it would but audio on Linux is such a pain. I think that porting it won't be too bad once it's done but I'm not promising anything.


Can you share some more details? I'm interested, but hesitant about putting my email into some random Google form.


It's gonna be the fastest piano roll. It will have semi-live performance, kind of like Ableton Live, but Live is sample-based and mine is music-based. I don't want to reveal too much, but I've talked to professional composers, described the workflow I envision, and they were all like "I need this ASAP".

I don't know if this will ease your concerns, but I've been around HN for a while (I'm in the top 30 karma-wise); I won't spam you.


Have you toyed with live music programming? Just curious.

Also, nice endeavour.


So I'm working on this mostly to scratch my personal itch. I'm aware of those but to be honest I never found them to be more than toys. When I listen to music made in these, I feel like they generally lack some structure. My thing is all about helping you structure things.


Aight. I wasn't comparing btw, just wanted to have your point of view.


Just FYI there will be a small JS programming environment in my thing.


Any further details, maybe?


Are you interested in anything particular? I can provide a lot more detail but I think that none of that will do it any justice. Sign up and check it out when it's out.


No offense, but it sounds like an empty sales pitch. You're trying to bring attention to the product you're building - and there's nothing wrong with that - but you're presenting only vague promises, without discussing what DAWs get wrong about music in any significant detail.

In the spirit of constructive criticism, maybe you could at least point to specific negative sides of existing DAWs that you're willing to eradicate?


Will this have any support for external VSTs like Massive?


Of course, this is kind of standard. Generally I'll try to go well above and beyond what's possible today.

Note that I'm on the core team of AudioKit https://github.com/audiokit/AudioKit which is a platform for AudioUnit development so I know all about how dope plugins are :-).


Hi, I make audio plugins. Let me know if you need OEM plugins for your DAW.


Hm this would have come in really handy a while back lol. I might take you up on the offer still.


>Also the note input process and experimentation is extremely time consuming and the DAW never helps

What is so arduous about plugging a midi keyboard in?


I can think up a lot more complicated music than I can play. I don't always have access to a keyboard. Piano is a good instrument, but sometimes I want to write drums. Also, sometimes I want to express relationships between the individual notes, not just have the notes themselves. To record 10,000 notes from a piano, you need to hit 10,000 keys, possibly more than once, to record them. My thing will let you achieve the same result with fewer than 10,000 actions.
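
As a toy illustration of the general idea (not the actual feature set): describe a relationship once and let the tool expand it, e.g. arpeggiate a chord progression instead of entering every note by hand:

    PROGRESSION = [[57, 60, 64], [53, 57, 60], [55, 59, 62], [57, 60, 64]]  # Am F G Am

    def arpeggiate(progression, notes_per_chord=8, step=0.25):
        """One 'action' (the progression) expanded into many timed (start, pitch, length) events."""
        events, t = [], 0.0
        for chord in progression:
            for i in range(notes_per_chord):
                events.append((t, chord[i % len(chord)], step))
                t += step
        return events

    print(len(arpeggiate(PROGRESSION)))  # 32 notes from 4 chords' worth of input

That's obviously simplistic, but it's the shape of "state the relationship once, get many notes out".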


I agree with you with respect to representation. Pattern/sequence generation is something most DAWs don't have outside of plugins like Cthulhu, whereas the languages can do it easily.

Another thing I've been dying for is an easier way of layering sounds, for example drum hits. Multiple MIDI sends feel hacky in Ableton and certainly not like a first-class feature. On the other side of things, having to rearrange multiple WAVs after changing a note is even more painful.

I totally agree with you about the actions, though. Configuring plugins etc. can be a huge drain and it's very mouse-heavy.


I'm not sure I understood what you meant by layering drum hits. If you mean having a single trigger fire a stack of layered samples that together form a drum hit, like a snare or bass, then some drum samplers come to mind, like Geist. That's the one I personally chose due to its wealth of flexible features, and layering a group of samples into a single "hit" is a foundational feature of the program.


> Another thing I've been dying for is an easier way of layering sounds, for example drum hits. Multiple midi sends feels hacky in Ableton and certainly not a first class feature.

Layering drum sounds is a typical feature in all DAWs, and in Ableton, with its instrument and drum racks, it's even easier to layer whatever you want. Maybe I don't understand what exactly you are trying to achieve?


I see - so almost like you could have a macro that knows certain changes / scales and just hit :dorian or whatever? That's interesting...
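
Hypothetically it could be as simple as a mode table plus an expansion command; everything below is made up, just to picture it:

    MODES = {
        "ionian":  [0, 2, 4, 5, 7, 9, 11],
        "dorian":  [0, 2, 3, 5, 7, 9, 10],
        "aeolian": [0, 2, 3, 5, 7, 8, 10],
    }

    def scale_notes(root, mode, octaves=1):
        """Expand something like ':dorian D' into concrete MIDI notes."""
        return [root + 12 * o + step for o in range(octaves) for step in MODES[mode]]

    print(scale_notes(62, "dorian"))  # D dorian: [62, 64, 65, 67, 69, 71, 72]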


Exactly. Be assured tho that you are just barely scratching the surface :-). Sign up above if you haven't, it will be dope.


Check out Jack Schaedler, who works on this at Ableton: https://jackschaedler.github.io/

He even made an interactive essay about the GRAIL text recognizer from the 1960s https://jackschaedler.github.io/handwriting-recognition/


I found the Circles, Sines and Signals to be amazing too. An awesome introduction. https://jackschaedler.github.io/circles-sines-signals/


I'm an amateur musician and one of the things I hate about electronic music is how "distant" it all feels.

I'm used to picking up the guitar, playing a few chords and writing a melody.

Ableton (or any other DAW) feels like a chore. I have to boot up the computer, connect the MIDI keyboard, the audio interface and the headphones, then wait for Ableton to load, then create a new track and add a MIDI instrument before I can play a single note.

I know the sessions view in Ableton was an attempt to make the music feel more like jamming, but it doesn't really work for me. A lot of musicians who play instruments I've talked to feel the same way.

I would love an "Ableton in a box" that feels more intuitive and immediate.


I made a pair of iOS apps to help bridge that gap: Composer's Sketchpad[1] and MusicMessages![2]. Above all else, I wanted to make note editing and navigation as simple as possible. At least for the basic ops, there are no tools, no modes, and no switches: what you see is what you get. Unfortunately, this also means that the apps are more suited for rough drafts than anything resembling full, polished pieces. But I've found them remarkably effective for notating quick rhythms, chord progressions, and even solos! They're very simple — there's not even MIDI support yet, only the built-in SoundFont — but I hope to add depth to them over time without compromising on their simplicity.

[1]: http://composerssketchpad.com

[2]: http://musicmessages.io -- will remove from sale to turn into a full iOS app at some point in the near future


If you're an iOS user, I highly recommend the "Music Memos" Apple App that nobody talks about. Pretty great. Even better if you were to get something like the Rode iXY to go with it!


If it is just about "start playing", then it sounds like you are just looking for a hardware synthesizer or groove box? You don't need a DAW to play an electronic instrument, for many they only come in when they start recording.


Sounds like you need to get a modular synthesizer :)

https://www.youtube.com/watch?v=V5Z0R9DS4u0

Or, as an intermediary step, like another commenter suggested, an OP1:

https://www.youtube.com/watch?v=umatbZ0n4mE

llllllll is a fantastic community of people building/playing with new & experimental musical instruments:

http://llllllll.co


Maybe the OP-1 would be of interest to you?

https://www.teenageengineering.com/products/op-1


Computers are powerful but boring as hell when it comes to creating music. Lately, physical machines have been getting popular again (drum machines, modular systems like Eurorack, etc). The computer is then used to glue everything together (mixing and mastering).

You can do electronic music without a computer and with the immediacy of an acoustic instrument: pick up a Volca or an Analog Rytm and you can go a long way (some machines are quite expensive, but hey, guitars and pianos are, too).


No reason they have to be. I recently ordered a couple 'Bastl Kastles' to use as oscillators and LFOs (patching them into Xoxboxes and/or each other). Part of the reason some of those old things like DINsync hardware are prized is retro nostalgia, but the other reason is they're using primitive timing systems that have less jitter than the more sophisticated centralized MIDI systems. If you're using DINsync, all your instruments are playing their own sequences and a timing tick is just keeping them aligned. If you're running MIDI then you have to send actual note descriptions over a 31.25 Kbaud serial bus before you can get any sounds out. If you're sequencing from a DAW (not doing realtime processing) and daisy-chaining instruments, forget it ;)
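
The arithmetic behind that, for anyone curious: MIDI runs at 31,250 baud with 10 bits on the wire per byte (start + 8 data + stop), and a note-on message is 3 bytes, so each note costs roughly a millisecond of bus time:

    BAUD = 31250
    BITS_PER_BYTE = 10                    # start bit + 8 data bits + stop bit

    byte_time = BITS_PER_BYTE / BAUD      # ~0.32 ms per byte
    note_on_time = 3 * byte_time          # ~0.96 ms per 3-byte note-on

    print(byte_time * 1000, note_on_time * 1000)  # ~0.32 ms, ~0.96 ms
    print(6 * note_on_time * 1000)                # a 6-note chord smears over ~5.8 ms

Running status trims a byte here and there, but everything on a chain still shares that one serial stream, which is where the smear comes from.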

I'm currently getting a bank of Delta Labs Effectrons together for delays: old delta-sigma (like SACD, but potato grade) digital delays. It's possible that popularizing this would make Effectrons trendy and pricey, like the EMS VCS3 synth is trendy and pricey… but Bastl Kastle kits are dirt cheap. There's no reason there can't be a delta-sigma digital delay kit with the same functionality as Effectrons and the same or better sound in relevant ways. Tools don't have to be expensive these days.


I only have basic knowledge about these old things, so thanks for the info. While MIDI endured the effects of time quite well, as a developer it is a pain in the ass. Every device (or DAW) handles things a bit differently.


Check out the Novation Circuit. It's a brilliantly fast and intuitive sketchpad for electronic music. You can turn it on and be making music in minutes, without reading the manual. It runs on AA batteries, so you can doodle ideas anywhere. It isn't a replacement for a DAW, but it isn't supposed to be.

https://novationmusic.com/circuit/circuit


I've used all the software DAWs. They are a chore. They're all terrible. Extremely cluttered UIs, and way too much to learn to do the simplest task. By contrast, I always found hardware DAW touch-screen interfaces a joy, because they strip it down to entering and editing notes, everything is tactile, and they sound great.


I found Multitrack Studio to have a better UI than the rest of the DAWs, but it's not so geared towards electronic music. They also have pretty good support for touch-screens. Too bad there's no Linux version, but that's hardly surprising for multimedia software.


That "jamming feel" is sort of what they were going for with their "Push" line of controllers. Of course, you still have to do all of the things you said about launching the application, but I think that's akin to setting up a drum kit or hauling your amp out of the closet and tuning your guitar.


Perhaps you'd be happier with a fully featured keyboard, no computer required? They make some great stuff these days, and it all feeds back into the computer pretty easily if you come up with a riff or loop you like, and want to put it into a proper DAW for polish. Just flip a switch and you're rolling.


The Octatrack is a fun box that can basically replace Ableton: https://www.youtube.com/watch?v=yjavTXRvZBE


I purchased the Ableton Push 2 a month or so ago and it has to be one of the most beautifully engineered pieces of equipment I have ever used. Look up the teardown video. Extremely simple, yet elegant. The Push 1 was created by Akai, and apparently Ableton wasn't satisfied, so they designed and built their own.

https://www.youtube.com/watch?v=YItWQdJgXLs


Push 1 is still awesome and there's a ton on the second hand market for a fraction of the price of Push 2. You get way more than you pay for at those prices.

I'm happy with mine, I like the pads and it's fine for sequencing and playing. It still receives updates and improvements.

I do like the screen on Push 2, it looks like a nice update, but you are paying a lot for that screen, so you better use it!

I also own a Maschine Mikro MkII by Native Instruments, it's my go-to machine for finger drumming beats and sonic experiments... great pads, very precise, compact and enjoyable. The Maschine software is very good, and the add-on sound packs are great quality.


Having tried both a ton before settling on the Push 2, the screen is great, but that's not why I bought it. In every way it feels better to make music on than the first Push. The pads especially are a thousand times more responsive — the Push 1's got this weird wooden feeling that never seemed to register the velocity I was going for.

I think the Push 2 is ~3x as much, and it is worth every penny.


I was shocked at how sensitive the pads are on the Push 2. A graceful brush of a finger triggers the pads. I know that sensitivity can be adjusted, but wow! I produce classic Hip-Hop / Rap music which involves sampling. The screen is insanely helpful when it comes to chopping samples.


I have been listening to some of the Maschine patches on YouTube and am very impressed.

EDIT: Wow the Mikro is very affordable!


Also a fan of the Maschine Mikro, though I don't like the Maschine software very much.


Oh man, I've had mine for a bit over a year now and I love it so much. It really builds off the fundamentals mentioned on the first page of this shared post: you jam out little riffs and then mix and match them till you get something you like. There have been many days/nights where I've needed to do some programming on my game and instead burned 3-4 hours without realizing it because I hit a groove. It's so conducive to jamming it's unbelievable. Not to mention Ableton is a really powerful piece of software that doesn't get in your way when you're experimenting. The pairing makes it one of my absolute favorite pieces of hardware (that and my OP-1).


I am scared to get the OP-1 because it looks like TOO much fun! Yes, I find myself jamming with the Push, not really making anything in particular. The scale mode helps me as I have little music training. Ableton simply works for me. I am using Push 2 + DDJ-SR / Serato and couldn't be happier. Although Serato leaves a LOT to be desired (pitch shift + fx + vst support), it just SOUNDS great.


Related, this is trending on Reddit this morning. Just fascinating to watch someone build a catchy track up on such an (apparently) basic piece of equipment...

https://www.youtube.com/watch?v=FK5cU9qWRg0


It's a cool video, but not really a very basic piece of equipment.

> That's actually a $900 synth/sampler/effects processing unit called the Op-1 by Teenage Engineering.

From this thread on /r/ArtisanVideos:

https://www.reddit.com/r/ArtisanVideos/comments/6a2yq4/guy_m...


Not to mention that OP-1 stands for overpriced-1, you can do much more with way more affordable gear.


What's especially impressive is that the OP-1 has only four tracks, and only does destructive recording with no undo. Meaning that if you want more than four instruments, you have to overdub the same track, and any mistake will ruin the current loop.

In other words, you really need to know what you're doing and have a really good idea of what you want to play to be able to make a song as elaborate as this.


As someone who has no musical talent whatsoever, I'm oddly intrigued by Ableton's products. I've occasionally stumbled across the Push[0] and been fascinated by it as an input device.

This site is another thing to add to my Intriguing Stuff list.

[0] https://www.ableton.com/en/push/


Here's the thing: music is a lot like math. The more you practice it, the more you find yourself becoming a "natural" at it.

I used to be terrible at math. Then I found a teacher who basically made me work every day on math problems.

Within a few months, I found that I actually enjoyed maths, and was even good at it.

Of course, without natural talent you're not going to become the next Bowie, but even a bit of practice can make you surprisingly competent.


I have a Push (v1) and it's a lot of fun. The whole idea is to pull the functionality out of Ableton onto a tactile device, and I think it succeeds in doing so. When I play with it, I stash my laptop to the side because you don't really need to stay glued to the screen while working with it.

Beyond its utility, it's a really beautiful device.


Are you able to use the Push as a generic input device? Could it drive something other than the Ableton software?

I can think of a ton of things I'd like to do where having a custom "keyboard" would be awesome.


Yes you can. There's a dedicated button that puts it into "user" mode, at which point you can use it as a general midi input device. Every button and encoder is mapped, including aftertouch. You can program the LCD display and set the colors of the pads (via midi sysex messages).

I'm not sure how programmable the display is on the Push 2, as it's a different display tech. I imagine you send it snapshots of a framebuffer rendered on your computer.


Me too. I own a Push 2 and Ableton Suite and I just practice and listen to a lot of music. Soon I will have some tracks to put up on SoundCloud.


This looks strangely similar to a collaborative app I made last year with Elixir/Elm/WebAudio API:

https://www.youtube.com/watch?v=TCVuLh5Io9A


Get Started Making Music (In Ableton Live).

Love the simplicity, though it does seem to favor EDM (for obvious reasons).

I've always loved the idea of using Live in a live improvisation context, potentially with multiple instruments having their own looping setup; or just a solo thing. It's hard to find that sort of thing, though.

Checking out Tone.js now.


There are a lot of bands that have been using Live for performance. Right now Sylvan Esso is my favorite example of this new instrument in action.

https://www.youtube.com/watch?v=ELNiiAldfyM

PS The song is "Radio" and it gets everything right about why I have always HATED mainstream pop radio.


Just learned about Sylvan Esso, thanks!


They put Tone.js to good use. Promoting Ableton by showing what cool stuff you can do with a free JS library that runs in the browser, weird? https://tonejs.github.io


To all the people complaining, I feel you. There is not one tool that takes you through the entire workflow of making music well, but they sell software pretending they do support the entire workflow. In truth, you write and arrange in specialized notation software, create samples in specialized synthesis software, or record live audio, then you use audio workstations to fix, edit, transform, and mix. Even there you may rely on external hardware or software plugins. These tools aren't meant for a one-person creator. They mimic the specializations in the music industry. A good all-in-one software simply does not exist, and small teams trying to work on these projects are trying to bite off a real big pie. It's very complex and requires a lot of specialized knowledge, and many of the pieces are probably patent-encumbered, too. But good luck!


The timing of this post is funny, as just this week I launched a little ear training game built with React and Tone.js: https://www.notetuning.com/


I first saw a link to this Ableton page from a group I'm in on facebook on April 24, so Ableton's had this up at least since then....


this is great. thanks!


The first page of that tutorial reminded me of a product I saw at the Apple Store a few weeks ago called Roli. They have a great app [0], but the hardware [1] itself is not ideal, yet unfortunately necessary to unlock some features... I will be waiting for a v2...

[0] https://roli.com/products/noise

[1] https://www.apple.com/shop/product/HKFR2VC/A/roli-lightpad-b...


I was looking for an app like this for my son. He started with "My Singing Monsters" and some music lessons at school, but when I tried to get him into Garage Band it was too much for a beginner.

Thank you to the creator ... I will show it to him later today. I am not sure how far he can take it, but I like what I have seen so far.

Also, if anyone has other suggestions for music-making apps for tween kids I am all ears ...


I'm sorry for posting this several times in this thread already, but... I made a "music painting" iPad app called Composer's Sketchpad that sounds like it would be a good fit. It's not made for kids, but it did win a Children's Technology Review Editor's Choice Award last year. Maybe up your alley? There's a Lite version, too: http://composerssketchpad.com


Interestingly enough, my 6-year-old was able to dive in and start having fun drawing MIDI notes into Ableton Live Intro (made by the creators of the app linked for discussion) and listening to them.

Ableton Live is certainly complex, but there's so much fun to be had if you or your children can focus on one feature and enjoy the musical accidents and exploration.


I think the design of this is really interesting.

It's designed to make the user (i.e. anyone who likes music) just want to play with it, and it's very intuitive thanks to its simple, visual layout. It also provides instant feedback that makes you want to keep tinkering until you make something you like more and more.

Web development/programming training tool makers should really take note of this.


Wow this is super high quality content. Props to Ableton. By far my favorite DAW, but I wish they would come out with a cheaper license.


Cheaper than the $99 Intro license?


Intro has a limit on how many tracks you can have, which makes it more or less unusable for my purposes.


I can't speak for other DAWs, but Ableton was really easy for me to pick up as a complete novice to digital music production.


I agree. A lot of the other UIs are insane. One of them boasts about its full physics package to accurately render the cables connecting one "device" to another. I thought it was so gimmicky but a lot of them do the same stuff and people buy it so what do I know. Ableton just makes sense.

But maybe because I'm not an artist. I just like learning these tools. I will say that with a few hundred bucks of equipment (a Launchpad and a Kaossilator2) I've had hours of fun just "jamming".

Also for more technical fun, there are 3rd party MIDI loopback interfaces available on Windows, so it's easy to write your own instruments. Took about an hour to hook up an Xbox360 controller so I got a few x-y inputs. Ableton makes it super easy to map them.
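
If anyone wants to try something similar, here's a rough sketch of the idea in Python, using pygame for the controller and mido for the MIDI output; the port choice, CC numbers, and polling rate are all placeholder decisions, not anything Ableton-specific:

    import pygame
    import mido  # needs a MIDI backend such as python-rtmidi installed

    pygame.init()
    pygame.joystick.init()
    stick = pygame.joystick.Joystick(0)
    stick.init()

    out = mido.open_output()  # or name a virtual port from a loopback tool like loopMIDI

    while True:
        pygame.event.pump()
        x = stick.get_axis(0)  # -1.0 .. 1.0
        y = stick.get_axis(1)
        # map each axis onto an arbitrary MIDI CC in the 0..127 range
        out.send(mido.Message('control_change', control=1, value=int((x + 1) * 63.5)))
        out.send(mido.Message('control_change', control=2, value=int((y + 1) * 63.5)))
        pygame.time.wait(20)  # ~50 updates per second

From there you MIDI-map the CCs inside the DAW exactly as you would for a hardware knob.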


The Ableton hate is pretty unfounded. Most people use third party synths and effects anyway so who cares about what the DAW comes prepackaged with.


Wha? Ableton has great built-in effects and Sampler + Operator are fantastic instruments. Some of my favorite producers use almost exclusively built-ins. You can get really far with just a few stacked Operators + saturator + erosion + overdrive + multiband dynamics.

All I've felt the need for so far is a better limiter (you can't really push Ableton's) and a multi-band distortion plugin.


Plus those instruments are developed by AAS [1] - a company known for their top notch physical modeling instruments. They are very powerful indeed.

---

[1] https://www.applied-acoustics.com/


> Took about an hour to hook up an Xbox360 controller so I got a few x-y inputs. Ableton makes it super easy to map them.

Fun tip: You can do this with an Xbox 360 (probably Xbox One) Rock Band drum kit for a cheap e-kit. The drumkit just sends gamepad input and there are programs to convert that to MIDI.


The Kaossilator 1 (KO-1) is available on ebay for under $100 and they are massive fun. Never a boring train/plane journey with a KO-1 and some headphones.


It's pretty easy to get the grid idea, but when you start trying to do anything more complicated Ableton turns out to be a mess, with many limitations and arbitrary weirdnesses.

E.g. you can't send MIDI sysex out of Live (except to the Push 2 controller). That kills it for all kinds of hardware automation and advanced synth programming.

Live has no concept of a mono track, so it wastes a lot of DSP resources processing effects and mixes in stereo for no reason.

There's no simple hybrid clip arranger mode, which is something most other DAWs can do.

MIDI clip files come with an audio preview. The clips aren't associated with any drum sounds. So you hear the preview, think "I like that...", load it, and then you have to spend half an hour picking the right drum sounds for it.

And so on. I've really tried to like working with Live, but there are just too many design decisions that make no sense for it be to anything other than frustrating.


I started in that mode (since I tend to do long drones) but once I adjusted to the non-linear workflow it became super easy. It helps that the ecosystem around Ableton is so rich now.


What does "hybrid clip" mean? Audio + MIDI in the same track? Live's arrangement view + session view in the same screen?


FL Studio has a nice learning curve too, though a different one. FL's best features IMO are the pattern sequencer and the piano roll. They're so intuitive; once you get the DAW's flow you just get creative and the limits disappear.


I moved from Cubase to FL two years ago. I'm in love with the patterns in FL!

The only thing I really miss is the feature in Cubase where you could add effects to an audio pattern in a destructive way. You could create really complex and glitchy patterns and it was easy to mix them together (cross-fade and such).

The other thing I would love ImageLine to do is a better workflow when you use audio samples directly in the sequencer. Things like fade in and fade out and a much bigger zoom overall in the sequencer to move samples around.

Edit : I know you can work with Edison but it isn't intuitive imo


Ableton Live is my main daw. I use it every day, generally for hours, and for a wide variety of purposes.

The most depressing thing about ableton is made obvious in two seconds of messing with that tutorial. A complete disregard for music in the sense of pushing boundaries of time, or doing things that are not tied to any sort of grid, and the sense of music as an emotive form.

So many aspects of music are very annoying or borderline impossible to do in ableton. Yet in all these years, and with so many installations, they just never addressed those issues. Instead they vaguely pretend as if music that would require features they don't have is radically experimental. Which might become true if so many people learn music only through using their software.

Seriously, Ableton. Stop pretending making music is clicking on and off in little boxes. It's embarrassing.

--

Edited to take out the "art" part and put in a couple of more specific criticisms.


Hate to be a pedant... but I'm not sure any one person gets to define what "music in the art sense" even is.

Yes, Ableton is rigid. Yes, Ableton favors certain musical styles over others. Yes, Ableton, here and in their design of Live, may just be giving lip service to anything beyond rigid song structure, tempo and dynamic changes, etc. Yes, Ableton loves their little boxes.

But, I find it hard to believe that Ableton has a "complete disregard for music in the art sense." If "art" inherently means "unquantifiability" or pure "aesthetics", then sure, session view Live might not be your best bet. Regardless, arrangement view is basically the same as Pro Tools as far as I can tell/remember from what I've used of Pro Tools.

What is Live missing for you?


I think either my phrasing was terrible (likely) or people aren't seriously considering what I am saying.

Try just looping a sample that is of arbitrary length, not some multiple of beats. This is something we have been able to do fairly easily since the 1980s, and with moderate effort before that. Ableton made this into an unusual technique.

The entire arrangement view only superficially resembles protools, the automation, the time stretching, everything really, is completely different.


I don't understand, it's easy to get arbitrary length loops, that's the reason there is a "fixed length" button on the Push 2 which you can enable if you want.

The arrangement view is not meant to be Protools.

Ableton is actually quite nice for doing experimental and unusual music, it just is built around doing it LIVE. If you want to paint outside the lines, you can with it. "clicking boxes on and off" is simply a way of perceiving layering.


Those loops are still a whole number of beats. When I say arbitrary, I mean lengths that may not be any integer number of beats.


You can disable quantizing and "warp" for clips (or just some clips if you like).

I sometimes use Ableton for noise shows, you can definitely get weird and off the grid with it. Record in 2 loops, not quantized so their lengths don't match exactly, duplicate each loop a few times. Disable warp on a few copies, enable repitching on a few others. Then when you twist the tempo knob some loops rise in pitch while getting faster, some just get faster but keep constant pitch, and some stay the same length and timbre. Record that to another track, then duplicate THAT and repeat, etc.


This takes like a second. Drag a sample into an audio track (in arrangement view), copy it, paste it again where the last sample ended, repeat until it goes on as long as you need. Do you not know how to use Ableton at all?


Ignoring the ridiculously insulting tone and just pointing out: using your method, what happens if I subsequently change the tempo in the file?


It breaks. But now you're just adding arbitrary conditions. If you want it to loop without breaking when the tempo changes, load it into Simpler, turn off warping, and create loop points around all of the audio that you want. And then create a single MIDI note holding it on for as long as you need. Or if you need something really unusual, load a looper plugin into Ableton--there are tons. You're acting as if problems that only require a few minutes of cursory thought are impossible to solve.


If it's not a multiple of beats, would it generally sound good in any musical kind of way?


I recommend Steve Reich - Music for 18 Musicians :-) Composing in the DAW is the best way to make someone else's music. Create your own creative process; then you are on the way to your own sound.


Yes! So many examples of why.

What if I want the sound of rustling leaves to come and go across the course of a song?

What if I wanted my piece to happen over a drone tone similar in function to a tamboura or a constantly feeding back electric guitar?

What if I wanted to have another sort of loop and while playing it back, experiment with different tempos against it to see what sounds right?

And a zillion other things.


You can turn off auto-warp on audio tracks by default globally, and you can disable warp on a per clip basis as you go.

In your specific case... disable warping on your ambient track, line it up against your warped/synced "another sort of loop" in arrangement view, and then change the master tempo of the track until you find out what you want in terms of different tempos. (This is off the top of my head, and without the program in front of me. Apologies if it's vague.)

Live can definitely function as a "dumb" multitrack recorder that lets you do those things -- but, by default it has all the tempo/beat/quantize options turned on.

I saw your edited post above... I think people are reacting negatively to your criticisms because they're a bit harsh, and I personally think it's shooting the messenger (Live). You can do the things you want to do in Live... but by default, out of the box, it's not what it's designed for. Live lowers the barrier to entry for making sound... the people/users cranking out 4-4 120bpm tracks likely wouldn't be making anything had there not been Live. If you don't want to call that sound art or music, that's on you. To many people, that's still music, and music they might not have created otherwise.

EDIT: Just saw your other post here (https://news.ycombinator.com/item?id=14300672) with some more specific criticisms. I feel your pain. You obviously already know how to do the thing I'm mentioning above, and are referring to more complicated scenarios. Thanks for sharing.


>What if I want the sound of rustling leaves to come and go across the course of a song?

Trigger it via MIDI as a one-shot in the sampler, for example. Sampler and Simpler params can be automated. If you want some 'arbitrary' looping, again use the loop points of Sampler and set the (re)triggering appropriately.

You can do all these things if you want to, you just have to invest enough time learning the tools.

Edit: If you respond, you may mention that you have invested the time. Perhaps you have, and I'm just misunderstanding your case or needs.


A good example of this is polyrhythm.
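
For instance, the onsets of a 3-against-4 pattern are trivial to compute, but a fixed beat grid tends to fight you on the "3" side unless you switch to a triplet grid. A tiny sketch (the function is just illustrative):

    def polyrhythm(pulses_a, pulses_b, bar_beats=4.0):
        """Onset times (in beats) of two evenly spaced pulse streams over one bar."""
        a = [i * bar_beats / pulses_a for i in range(pulses_a)]
        b = [i * bar_beats / pulses_b for i in range(pulses_b)]
        return a, b

    print(polyrhythm(3, 4))  # roughly ([0.0, 1.33, 2.67], [0.0, 1.0, 2.0, 3.0])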


I feel the need to address some of the typical HN poison floating around. I've got a very expensive music education. It doesn't make me better than an "Ableton clicker". Music is open to anybody, and is defined by what moves one, not by the mechanism of its creation. America really tends to treat it as some exclusive art club (beyond required elementary school classes), and that's really just wrong.


Right on. Anything can be an instrument, from a 5-gallon bucket to a $100,000 concert piano. I never understood the snobbery, even among hacker crowds, toward people using certain tools to create music. Heck, my C64 made music to the tune of Daisy and it was super cool. If someone wants to produce live music I am all for that.

PS Punk Rock and Indie Rock are still better than whatever anyone else listens to :) Just jokes.


Typical HN poison?

Please refer to my comment history and also please rebut specific things I said. I rarely say negative things and I am a user of this product all the time. I am genuinely frustrated by the lack of progress in recent years, since I use the product literally every day. This is not poison, it is legitimate criticism.


I didn't mean personal offense in my reply. "Stop pretending making music is clicking on and off in little boxes", is the specific portion I was referring to, but more generally I just aim to promote inclusiveness.


Music is made in your head, the DAW is just a tool to get it out. You can turn off grid snapping and use samples the way you like and many more creative methods to do whatever you please.

> So many aspects of music are very annoying or borderline impossible to do in ableton.

Like what? Scoring music? How about using a proper tool then? Nobody complains that they can't write hardware drivers in PHP.

My DAW is Reason, and because it sucks at making mixtapes I got Ableton too, because it's superior for that use case.


I really don't buy this. Ableton was created by an excellent musician (Robert Henke), and its design reflects a musician's perspective. It's intuitive and flexible. It doesn't work for every possible workflow, but no individual piece of software could.


That's arguing from authority and ignoring the meat of my point.

I know Robert Henke created Ableton Live and it was a tremendous innovation when it came out.

Over time it gradually expanded features but never dealt with a variety of things that might be development intensive but would in no way break the workflow.

For example:

Support advanced MIDI stuff. Like, any of it. Why is it so mind-bogglingly inconvenient for me to hook up my Continuum or my LinnStrument to Live?

Loops of arbitrary length / non-warped loops, as I mentioned in another comment, are unnecessarily hard to implement.

Why can't I MIDI-map certain controls? Why can't I macro-map certain controls?

Why can't I easily lock stuff in the arrangement view so I am not terrified to change the master tempo?

Why can't I group things in arrangement and edit them together?

Why can't I intuitively stretch a sample in arrangement view, when the whole selling point of Ableton is time stretching?

None of that would break the workflow.


It's not arguing by authority, it's arguing by counterexample. Robert Henke doesn't represent all musicians, but he is a musician, so it's not possible that Ableton doesn't reflect the perspective of any musician.

All the things you mention relate to a certain workflow. It's not my workflow, it's not every musician's workflow. It'd be great if Ableton had good SysEx support, but it's not accurate to say it has a "complete disregard for music" because it doesn't fit your workflow exactly. Ableton is designed to simplify working with repetitive beats. If you don't want any of those, of course it's going to be difficult to use.


You are taking a quote out of my sentence in such a way that it changes the point of what I was saying. I am confused about your motivation.

Do you think I don't know why Ableton Live is a great tool?

I do know it is great. And I feel I understand the tool deeply enough to give real criticism regarding unnecessary limitations.


For those of us who haven't used Ableton yet, what is it missing?

Edit: fixed autocorrect typo


It just encourages so much grid based work that you tend to lose perspective on the fact that most of the music made throughout all of the history of ever wasn't actually on a grid.

I love ableton a ton. It just encourages a limited mind set.


Not sure if that's a typo or intended...


> For those of us who haven't used abortion yet, what is it missing?

AutoCorrect fail? Or not-so-subtle criticism of the product? ;)


What DAW would you recommend? And what features does Ableton miss?


It seems to me each DAW focuses on a particular problem space.

- Ableton Live: focus on painless transition between live and written electronic music, quick producing

- Reason: focus on modulating everything with a studio feel

- Orion: focus on cleanly mixing looping music, slow producing

- FLStudio: focus on composing looping music

- Reaper: focus on customization and performance

- Protools: focus on getting the most money out of the customer

- Logic: focus on getting the sound you want and making a song easily, pre-made effect chains

- Studio One: a bit similar to Logic in spirit

- Cubase: I don't know what it focuses on tbh

A major problem is that the top-down goal of making one song can easily be dwarfed by mixing difficulties that then capture your focus forever (aka "loopitis").

As an audio plugin developer, I get to know a bit of every DAW out there, for testing. A popular opinion is that tools don't matter for making music, but a more realistic opinion is that they actually do matter, by warping how you think about the problem.


> but a more realistic opinion is that they actually do matter, by warping how you think about the problem

Yes, and this is a more general problem with tools. In some cases, the tool changes your attitude towards the activity in very subtle manner.

A good example of this is the creation of the ballpoint pen and what it did to handwriting (https://www.theatlantic.com/technology/archive/2015/08/ballp...).

> Fountain pens want to connect letters. Ballpoint pens need to be convinced to write.

It takes serious effort at times to force a ballpoint pen to actually write, which has the effect of turning people off writing and ruining the quality of written material.

I find it best to always acknowledge that there may be a better tool for the task you're trying to do at any moment, and realize the point at which you should go and search for that tool.

But as Rumsfeld would say, the unknown unknowns are what you need to worry about. That is, the case where you don't know the tool you're using is limiting you, and in what ways.


> - Protools: focus on getting the most money out of the customer

Why am I giggling sooo much :-) The whole talk about it being an 'industry standard'.

I missed Renoise on your list:

- Renoise: Focus on appealing to former tracker users and demoscene music makers.


Renoise: incredibly accurate timing. Yes demoscene music makers, but it's also exceptional at producing anything that's got to have incredibly tight timing. It even excels at sending MIDI data without jitter.

Also, as the heir apparent of the tracker movement (its audio can be a heck of a lot better than the early generations of trackers, and it's happy to work with multichannel DACs for outboard analog mixing) it offers a distinct way of thinking about sound-making: that old 'tracker' way of composing. Since it's based on audio samples of arbitrary fidelity, it's not restricted to 'demo-y' sounds: you can just as well use huge 24-bit high sample rate sounds, or use it as a way of layering audio tracks as one might in a DAW.
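
For anyone who has never seen a tracker, the mental model is roughly this: a pattern is read top to bottom, one row per tick, and each cell says which note to play and which sample to play it with. A toy sketch in Python (the two-column pattern is made up; Renoise's real pattern format is far richer):

    # Toy illustration of the tracker mindset (not Renoise's actual format):
    # rows are ticks, columns are tracks, and "--- --" means "nothing here".
    PATTERN = [
        # track 1      track 2
        ["C-4 01",    "--- --"],
        ["--- --",    "E-4 02"],
        ["C-4 01",    "--- --"],
        ["--- --",    "G-4 02"],
    ]

    for row_no, row in enumerate(PATTERN):
        print(f"{row_no:02d} | " + " | ".join(row))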

I'm not involved with Renoise as a company, BTW, but I AM completely charmed by it and own both it and Redux (the DAW plugin version).


\o/ I found a Renoise user.

Yes, a thousand times, dude, Renoise is that good. It's my main DAW and I feel really spoiled by it, especially by its incredible stability.

Every time I read articles or forum posts about people discussing their DAWs, I hear people complaining about Ableton/FL Studio etc. being unstable, crashing on odd occasions, etc. I have never experienced a crash in Renoise, not even using multiple VSTs, even some buggy ones that are prone to crashing or scrambling other DAWs' internal state hehehe... I have composed tracks in Renoise for over 6 hours only to then realize I hadn't saved a single time.


I tend to recommend ... Ableton! I think the strengths outweigh the weaknesses, especially for beginners.

But they are trapped in a sort of convention that makes my life hell when I want to leave it.


Reaper is incredibly full-featured and free/cheap.


Reaper is optimized for live music, and, although it can do electronic music decently, its workflow is not tailored for it.


That's true. I should have qualified my response by noting that I pretty much exclusively do live instrument recording.


I noticed many people commenting here think there's only one page.

There's more -- scroll down and click next.


Over the years I like to think Ableton has been at the forefront of the digital music community (at least among peers like Korg), at a special nexus of hardware, software, VST developers, and global sharing, by way of an incredibly robust and deep Live Suite program. Reaching out and sharing community resources like this is habitual for them, and I'm very pleased to see it get this sort of attention from this community. The intersection of technology and art is a bright, multi-cultural future, and with that comes responsibility. To put it in a phrase, this is an example of Ableton providing a ladder up to new members rather than slamming the door behind them once a certain level is reached. Enjoy!


Like any technology, there can be lots of different inputs and outputs. I think it is safe to say that Roland's TR-808, 909, and 303 changed music notation, and music, forever with their popularization of grid-based music programming. It may be that Ableton is doing the same with their software. Each year the tools for these sorts of creative activities get better. The Beatles recorded Abbey Road on a giant, expensive four-track owned by a record label. In 1995 I saved up my money from a summer job and bought a 4-track cassette recorder for about $500. Now you can get a four-track app for your mobile phone for about $5. Or download an open source one for free.

YAY :)


Off topic, but I posted the exact same link about 24 hours earlier: https://news.ycombinator.com/item?id=14291332

Not that it's important, but I'm kinda curious: a) why my submission would only get 7 points, and b) how it was possible for someone else to submit the same link so soon after and gain the points rather than my submission getting boosted.

Is it just random chance/time of day of posting? Or is it because the user who posted this had more points to start with and so was more likely to be "noticed"?

Awesome site in any case!


Two friends and I have tried to make music production easier (and more robust) on the phone in our spare time, and came up with our iPhone app, Tize (https://itunes.apple.com/us/app/tize-make-music-beats-easy/i...), to that end.

If it sounds like something you're interested in please give it a go! We're always working to improve it and open to feedback. (Android is coming soon)


Seems like a blatant clone of iMaschine?


This is beautiful and amazing. I love how each step builds on the previous, and uses pop examples to explain theory concepts. I've often wondered about so many of the things presented here, particularly what common characteristics a genre has with respect to rhythm! Big kudos to the team who built this. I'd love to learn about the development backstory, as this feels a lot like an internal side project made by passionate individuals and less like a product idea dreamed up with requirements and specs.


I've been using Ableton Live for about a week after getting a free copy with the USB interface I bought (Focusrite Scarlett 2i2, highly recommend) and I had to turn to YouTube to figure out how to actually sequence MIDI drums in it.

I use it pretty much solely for recording, but I take advantage of the MIDI sequencer functions to program in a drum beat instead of recording to a click, because I've found my timing and rhythm is so much better playing to drums than it is just playing to a metronome.


I did music at GCSE and A-level so I knew about a lot of the basic theory here, but it's fallen out of use in the past year or two. The best part of this by far was the deconstruction of tracks that I like into their components and realising that they're not insurmountably complicated. Kinda like a musical version of "you could have invented monads".


I wanted to build something similar for mobile to make music on the go. I started it here (abandoned now, but code is linked): http://buildanappwithme.blogspot.in/2016/04/lets-make-music....


The following really helps with understanding the differences between electronic music genres.

Legendary Ishkur's Guide to Electronic Music:

http://techno.org/electronic-music-guide/ (requires flash)


If you want an interesting take on the 'Live' part of Ableton Live, look for 'Kid Beyond Ableton' videos. He builds up tracks live on stage by beatboxing all the instruments, and has recently been using something called a Hot Hand as his controller.


I think I've watched this video a ton of times: https://www.youtube.com/watch?v=eU5Dn-WaElI

That guy is using Ableton Live to re-create a popular song of The Prodigy.


Did this get voted to 1023 points (so far) because it's a great article, or does everyone just love music? Btw, I use Ableton now, after my Pro Tools rig was stolen, and I'm buying a new MatrixBrute. I can't wait to check out this site.


Similar concept using Daft Punk samples instead: http://readonlymemories.com/ plus some filtering and looping capability.


If you like that, you'll love Madeon's Adventure Machine - http://www.madeon.fr/adventuremachine/


It reminds me "Generative Music Otomata" http://www.earslap.com/page/otomata.html


This is extremely comprehensive for any beginner/intermediate musician/composer, and I'm really impressed at how they managed to implement the content in a mobile friendly manner!


Love it. Great web app from a really good company. I use Ableton a lot and I'm continually impressed with their software and content marketing activity.


I used to be a professional musician and I've used a lot of real Ableton equipment and I still found this incredibly interesting and fun.


This is really awesome. They really went the extra mile on building this out. It even supports multi-touch screens. Very well done.


Wow, this is super impressive. I fell in love after adding a few chords over drums. Amazing.


It'd be nice if we could share the stuff we make in the playground with friends.


Wow, this looks great. Is there an app for this? I'd love for my son to try.


Check out Auxy for iPhone for a really simple way to get started with sequencers.


I believe it's based around their flagship, Ableton Live [0].

[0] https://www.ableton.com/en/shop/live/


This is AWESOME! Sharing it with all my friends!

Thanks OP!


So much for being productive today...


Yep, not using the hottest framework, not a SPA, not a PWA. Just something that loads fast and works great. Good job.


That's fantastic!


This is amazing!


This seems like the wrong place to start. This seems like the place to start learning a DAW and snapping together samples, in order to, IMO, make depersonalized unoriginal loop music in a society awash with it, because DAWs and looping have created an angel's path to production and proliferation. Learn to drag and drop and you can tell people you meet that you're a musician or a producer. I've met too many mediocre people like this. There should be a disclaimer when this page loads: learn to play an instrument first. Bringing forth music from a physical object utilizes the body as well as the mind, attunes you to nuance, and emphasizes that music is primarily a physical phenomenon. It's also just fun, and you can jam with or perform for friends. This cut-and-paste, drag-and-drop, sample-and-loop mentality popularized by the rise of hip-hop has led to an oversaturation of homogeneous, uninspired, unoriginal sound in society. Maybe I'm old fashioned, but I think people should spend long, frustrated hours cutting and blistering their fingers for the craft, at least at first. That builds character and will show in your music as you move on.


I think there is a bit of projecting going on here, though I agree with you on a lot of points. The market is going to dictate what gets produced; just because a product is hard to make doesn't mean it's wanted or objectively good. There are plenty of starving jazz musicians who know way more about music than what's currently selling, and not just because the bourgeoisie deems it so: that's what people want to listen to at this point in history. John Coltrane is a legend and there are similar jazz musicians out there, but people want The Chainsmokers, and there isn't anything wrong with that. We could argue that making art with SVGs, Illustrator, or other animation software is incredibly unoriginal and that the world would be better if we taught people to use a paintbrush instead. Times change, but that doesn't discredit one or the other.


I want to partially agree with you because there is just something about physically playing an instrument that is so rewarding and fulfilling, to both the artist and the audience blessed enough to witness a true master guitarist/pianist/etc in action.

However, this is too narrow a definition of music. Having some experience with both actual instruments and DAWs, I think they exercise different parts of the musical brain, and as an aspiring musician one would be more "complete" having knowledge of both. But I don't think one place to start is more necessary than the other.

It's absolutely possible to be a fully developed musician and have the DAW be your one and only "instrument". It's a different skillset, and the best producers in the world can create brilliant music purely in the DAW. There is another subthread here that comments on the difficulty of generating unique sounds, for example. There are lots of specific skills that separate the weekend loopers from the pros who have found their own sound and character. It just takes a different appreciation. The latter set of folks have definitely paid their "frustrated hours cutting and blistering their fingers", just in a very different way.

I will say though that this form of music isn't very interesting to me in a live format vs traditional instrumental music, but the produced/composed/recorded outcome can be just as brilliant.


> This seems like the wrong place to start. This seems like the place to start learning a DAW and snapping together samples

While I agree that it makes sense, in a lot of cases, to start with learning an instrument, I do not think that replaces the information on this website. This is more about composing than anything. Sure, the first page introduces the user to "snapping together samples", but the next page introduces the user to what a drum beat is. Going through these pages I can learn to compose music, not just trigger samples. Nobody says that a site like this is a replacement for an instrument; rather, it's something additional that anyone could gain some insights from.

While there's an option to use patterns from the site in Ableton Live, this information is completely DAW agnostic, and could also be completely useful for people that have no interest in creating music with a computer.


> make depersonalized unoriginal loop music

Seems like the art fits the times, then. We live in a society where it's easier to "like" something on Facebook, or "friend" someone on Instagram rather than talking in person to another human.

In any case, I disagree with almost all of that. Almost all music you hear today is digital, filtered through electronics, and coming out of mass-produced speakers. Often in some giant shopping complex, in Your Town, USA.

Why write on Hacker News using a computer when you should be writing with quill and ink to your local newspaper? Assuming you aren't against Johannes Gutenberg and his heretical ideas.


Maybe you haven't developed an ear to distinguish between the good and bad? Walk down my college town bar scene and it's awash with dudes who have poured years into an acoustic and it all sounds the same to me... except my buddy Dave (who has the same feelings toward beats as you BTW and honestly I think it's because his guitar is a social crutch...)

Personally I've found DAW style copy and pasting, along with a skeleton set of quality effects, has provided an infinite space for learning and creation. I would never strut around calling myself a musician, but I am a graphic artist and the overlap between the fields is significant. In fact, I attribute my talent for wallpaper and tiled pattern design almost entirely to my loop making. Want to learn motion graphics? Muck about with some jazz drum loops.

I'd say there is enough overlap between beat production and all sorts of fields to warrant a curious mind to explore this world without a silly stigma.


I use DAWs as well. I make ambient music and sound collages as a hobby. I have a large loop and VST instrument collection and have purchased and used numerous apps. My argument is that to have a thorough understanding of music one should be well-rounded and begin with analog; one should become proficient at an instrument. I can't imagine not knowing how to play guitar, tune it up, improvise on it, play it in real time using scales, affecting its sound with my musculature, feeling the infinite nuance in the striking of a single note depending on how you strike it. I can't imagine not having spent time in garages with friends communicating through instruments. That education is invaluable and will deepen your understanding. I just think starting in an artificial realm is a bad idea for a proper music education, and people will benefit more by learning music in-the-world, at least initially. This isn't to say that learning an instrument can't still leave you as the open mic guy doing drivel Dave Matthews covers on an acoustic, as you mention.


You're a musician, you're just not classically trained in traditional instruments.

It's the same as a modern photographer who doesn't know how to develop film. Does it make you less of an artist because your photos were developed on an LCD screen? If anything, it frees you from spending time in a darkroom and gives you more time for shooting photos.


Having assisted a photographer who spent weeks coming up with ideas for, and planning, sometimes single photographs that we traveled something like 100 km to take, a person both very gifted and work-obsessed, I am absolutely firm in "not being a real photographer", even though I do have somewhat of an eye for it, if I may say so myself. That's not to belittle what I or others do, but my way of tipping my hat. I cannot use one word to conflate all that: she paints with light, she thinks long and hard about what she will paint and how she will do it, she knows her palette, and doing right by whatever is going on in her head, as well as she can, consumes her. I am not that way, and I doubt the people who make most of the photos we see on the web are that way.

And yes, she could do all that 100% digitally too; she uses digital if need be, and the big frame camera stuff she scans anyway, to mangle in Photoshop. She just also spent her nights in the darkroom she had built in her apartment ^^

Have you ever met someone who is passionate about dancing? The kind that kills your knees by the time you're 30? Imagine saying to them "oh, I'm a dancer too", just because I sometimes dance at home or at the club. In a sense, yes I'm a "dancer", in another, heck no. And the distinction I'm looking for isn't covered by "professional photographer" at all.

Oh, and I also feel that way about the "music" I make with trackers. I put it in quotes not because I don't listen to it for hours on end with a grin on my face, but because I just derp around until I like the result. I know how seriously in contrast I take the lyrics I write, that's an entirely different game; but the "music" really could be the way I make it or a million other ways, I don't care too much. I'm easy to please and lazy in that regard. Everybody has to decide that for themselves, I'm not trying to delineate "serious art" or define "art" in general, but still, if you'd call me a musician I'd say that feels subjectively wrong, I don't want that label, it's a bit too big.


For me, I disagree. I'm a fairly untalented guitar player. I spent 2 years with several teachers and self-learning, and I can't really play anything. I can improvise a bit, knowing the minor pentatonic scale for example, but I can't really reproduce anyone else's music. I also have a lack of motivation to do so; after all, someone already wrote and performs that.

On the other hand, I am able to write my own music using tools like Guitar Pro, Garage Band, music sequencers, etc. I can even play that music if I'm careful to write it within my limited knowledge of the guitar. I really wish I had been introduced to an easier way of creating music when I was younger, so that I first became a hobby music builder and later moved on to playing.


Not really sure exactly what you're saying, but I think this says a lot: "I also have a lack of motivation to do so." So don't. Compose the equivalent of The Rite of Spring without the motivation to even play a single instrument proficiently and I'll concede that instruments are irrelevant and DAWs are an appropriate starting ground for actual musicians.


I consider anyone who creates music a musician. Some of my favorite sounds are the ones created by amateurs in the subway. My favorite guitar players are self-taught. I won't be composing The Rite of Spring, but that's not what music is about for me.


Shameless plug, but have you heard of Magic Instruments[1]? It sounds like the bottleneck is the guitar's user interface and not your musical ability. One of the features is what we call Magic Mode, where a single button is a chord and the buttons are laid out in any key/scale, but it can also be played like a traditional guitar.

[1] www.magicinstruments.com


Wow. I have to say I hate this. This dumbs it down for sure. This is just one step up from guitar hero, which is at least premised as a game. It sounds like there is just no room for nuance and personality in that guitar. SMH.


Thanks for the feedback. This isn't the first time we've heard this exact comment, so I'm going to try and start a constructive dialog.

> Wow. I have to say I hate this. This dumbs it down for sure.

I can see how this dumbs it down for people in the beginning, but where does a beginner learn how to play an instrument, given a 90% failure rate in the first year? It's clear that the problem hasn't been solved, otherwise more than 6% of the US population would be playing a musical instrument.

> This is just one step up from guitar hero, which is at least premised as a game.

Partially correct: it's as easy as Guitar Hero in the beginning, but the only resemblance is that there are buttons instead of strings on the fretboard. Otherwise, it's an entirely different product. We're a computer that is shaped like a guitar.

> It sounds like there is just no room for nuance and personality in that guitar.

While it's fun to think that a guitar made of wood, tensioned strings, and metal has more personality than a guitar made of plastic and metal, I beg you to defer your criticism until you've tried the instrument yourself.

Kind of like electric cars vs. ICE cars. I'm a car geek through and through and there is nothing like slamming your foot into the gas pedal and going fast. The smell of petrol and the noise from the engine/exhaust are the visceral traits people talk about when they drive their ICE cars. Does Tesla lack personality and nuance because you don't smell and hear the same things? I like to think it's a different personality because I still get the same goosebumps when you switch to insanity mode and floor it.


>I can see how this dumbs it down for people in the beginning, but where does a beginner learn how to play an instrument, given a 90% failure rate in the first year? It's clear that the problem hasn't been solved, otherwise more than 6% of the US population would be playing a musical instrument.

I don't see that failure rate as a bad thing. People who don't want to play guitar that badly will "fail," which could also be known as moving on to something else (?). How is this guitar a stepping stone to actually playing a guitar, though? I imagine most people will simply stop at this guitar and use it as a party trick, which is fine; I'd even play around on it if it were on hand. The furthest this guitar can go in teaching someone to play an actual guitar is matching the strumming hand to the fretting hand's basic location on the fretboard, to say nothing of the fingering (pushing a button doesn't come close). In other words, it seems this guitar will get you like... .5% of the way to being proficient at an actual guitar and really no more. The strings on this guitar seem oddly loose, like rubber bands. Just an observation. I'm sure you guys have a reason for this. More catch for untrained thumbs, no blistering? The only way I see this being a step toward real guitar playing is if someone gets a really good feeling from being able to hold a guitar-shaped instrument and coherently produce sound, and decides to give it a full go. But the reality is that making the leap to a real guitar will lead them to run up against the same wall that everyone runs up against: the pain, blisters, hours of infuriating attempts at fingering, having to learn shapes, names of notes and chords, having to learn to tune, read sheet music or tabs, listen to songs to learn them by trial and error, etc.

>Partially correct: it's as easy as Guitar Hero in the beginning, but the only resemblance is that there are buttons instead of strings on the fretboard. Otherwise, it's an entirely different product. We're a computer that is shaped like a guitar.

Exactly. So learning to play this is learning to play a computer shaped like a guitar, not a guitar. Like I said, if someone gets a good feeling out of this and that's what it takes to take the plunge into the long hours of pain that guitar learning is then that can be a positive. Though I really cringe at the thought that someone might be fooled into thinking they've arrived at a replacement for a real guitar.

>While it's fun to think that a guitar made of wood, tensioned strings, and metal has more personality than a guitar made of plastic and metal, I beg you to defer your criticism until you've tried the instrument yourself.

Okay. I don't see how you can suggest that this guitar is capable of any nuance. It's a brute chord computer. The muscles of a hand on a real guitar, with a pick or thumb, can operate in such minute and multifarious ways, which as far as I can see cannot be done on this guitar. You can play a single note on a real guitar in so many different ways. You can strike the string with varying intensity, you can palm mute the string, you can use varying pressure with the fretting finger on the note, you can add vibrato, bend, and so on. Maybe I'm missing something. Can you do slides on your guitar? Bends, vibrato? The basics. Is picking your strings vs. finger-picking different, or is the signal just interpreted the same? Can you influence the timbre in any way other than using that knob? Can you post a video of someone doing a solo on your guitar? Doesn't seem possible. I guess you can use it to write basic songs if you're a songwriter. I guess on your guitar you can fret chords at speeds that aren't humanly possible, which is sort of interesting. If I played this I'd be tempted to treat the chord buttons like individual notes to come up with something outlandish. But I don't get how you can insinuate that the guitar is capable of doing what a real guitar can, if that is what you're doing. I take back my "hate" comment. I don't hate it. I'd mess around. But my criticisms above pertaining to your insinuation stand.

>Kind of like electric cars vs. ICE cars. I'm a car geek through and through and there is nothing like slamming your foot into the gas pedal and going fast. The smell of petrol and the noise from the engine/exhaust are the visceral traits people talk about when they drive their ICE cars. Does Tesla lack personality and nuance because you don't smell and hear the same things? I like to think it's a different personality because I still get the same goosebumps when you switch to insanity mode and floor it.

I don't think that analogy works. A Tesla has all the functionality of a combustion car. Your guitar doesn't seem to have the same functionality as a guitar; it has most of the functionality cut out, paring it down to chords, and that's saying nothing of the sound of the actual chords, which sound like my first $80 electric plugged into my first $50 amp. There's not a rich tone there. Maybe you could sell it with an amp with effects.

https://youtu.be/SdGYBI1fBhs

What would McLaughlin do with your guitar?


This is not the basics of making music. It's a super advanced technique using a computer. The real basics involve pencil, (staff) paper, and hard work. Downvotes please.


If you're taking a hardline-traditionalist approach to what constitutes music, why stop at pencil, paper, and hard work? Music existed long before written notation and documented music theory. You're not only taking an incredibly western-centric view, you're ignoring a large amount of music by doing so.


Yikes, don't tell Dave Grohl, Jimi Hendrix, Stevie Ray Vaughan, Eric Clapton, and the countless other musicians who don't know how to read or write sheet music...


One could argue that the basics involve putting sounds together in the simplest way possible - something like this. And that it takes far more work and skill to get to the advanced, low level stuff like writing on staff paper (and on the production side, compression, EQ, mastering, etc.).

To put it another way: Is learning x86 assembly the "basics of computer programming"?


You may be new here, but the rule against asking for downvotes has been in the guidelines forever:

Please don't bait other users by inviting them to downvote you or proclaim that you expect to get downvoted.


I have a musician friend who doesn't think drums are a musical instrument or that drummers are musicians. As far as I can figure, it has primarily to do with drums not having the same concepts of pitch/melody.

I feel like this is along your line of reasoning: only certain (classical, western) ways of making music count as real music.


There is sheet music for drummers.


I somewhat agree with your first sentence, but it's not a super advanced technique, either.

It's simply teaching rhythm and orchestration utilizing a step-sequencer. Similar to how beginners to music are taught rhythm and division of the beat, or Common Core Math grouping.
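
For the programmers in the thread, 'utilizing a step-sequencer' really is as simple as it sounds. Here's a minimal sketch of the 16-step grid idea in plain Python (the one-bar pattern is made up, and it just prints instead of actually playing samples):

    # Minimal step-sequencer sketch: a 16-step grid per instrument, stepped
    # through at a fixed tempo. "Triggering" here just prints; in a real DAW
    # each hit would start sample playback.
    import time

    BPM = 120
    STEP_SECONDS = 60.0 / BPM / 4        # 16th-note grid

    # Hypothetical one-bar pattern: 1 = play that instrument's sample on that step.
    PATTERN = {
        "kick":  [1, 0, 0, 0, 1, 0, 0, 0, 1, 0, 0, 0, 1, 0, 0, 0],
        "snare": [0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0],
        "hat":   [1, 0, 1, 0, 1, 0, 1, 0, 1, 0, 1, 0, 1, 0, 1, 0],
    }

    for step in range(16):
        hits = [name for name, row in PATTERN.items() if row[step]]
        print(f"step {step:2d}: {' + '.join(hits) if hits else '-'}")
        time.sleep(STEP_SECONDS)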


A more appropriate title would be: Get started triggering samples.

Making music is really something different IMO.


I'm not an artistic musician, but it feels really satisfying to make something that is my own, even if musically it's really simple. Your comment is nothing but unnecessarily elitist.


There's space for all kinds of music, and for some, triggered samples on a grid are considered music.


Making music (EDM included) is really, really hard. Most people who try will fail, and Ableton knows that. They are just trying to sell their product here, which is fair enough. But 'getting started making music' makes it sound like you only need their product: trigger some samples and you can be an artist too.

Try sitting behind a drum kit for the first time; do you think you can start making music? Most aspiring drummers need to practice for years before they can play a reasonable beat. Same with EDM: it takes many years of practice and improving all kinds of skills. This tutorial just shows how you can trigger samples in a grid and how you can put this together in a DAW. For me that is not making music, sorry.

Btw, the resulting loops in the tutorial were not made by beginners. And listen to the result: is that music to you? Would you buy that? Another title that would be appropriate, but wouldn't sell, could be: Get started making rubbish.


To be fair, as soon as you start messing with swing, it's no longer strictly a "grid" you're limited to. Timing and breaking the rules is where musical talent starts dancing with grids and boxes.

Ableton even recommend thinking beyond machine-perfect rhythm. https://www.ableton.com/en/blog/get-swing-drum-programming-t...

I find it funny the term "humanizing" is used in regards to adding swing. I guess no worse than a term like "moderate rock".
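
To make that concrete: swing and humanize are just small, systematic nudges applied to the grid times before the notes fire. A toy sketch in plain Python (the swing amount and jitter range are made up for illustration; this is not Ableton's actual groove engine):

    # Swing pushes every second 16th note late; humanize adds a few
    # milliseconds of random push/pull. A swing of 0.5 is perfectly straight.
    import random

    BPM = 120
    STEP = 60.0 / BPM / 4               # length of one 16th note, in seconds

    def grooved_onset(step_index, swing=0.57, humanize_ms=8.0):
        t = step_index * STEP
        if step_index % 2 == 1:                      # off-beat 16th
            t += (swing - 0.5) * 2 * STEP            # delayed by the swing amount
        t += random.uniform(-humanize_ms, humanize_ms) / 1000.0
        return t

    for i in range(8):
        print(f"step {i}: grid {i * STEP:.3f}s -> played {grooved_onset(i):.3f}s")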


> But 'getting started making music' makes it sound like you only need their product: trigger some samples and you can be an artist too.

I mean, is that wrong? Unless you stick to some elitist definition of artist, why is someone who plays with these samples to create something not an artist?


It's not wrong, it just feels misleading. For me it's as if Sublime Text came with a little tutorial titled "Get started making apps" that only gave an intro to writing a hello world program in C++, to sell their product.


I don't get the problem with that either. Being an "artist" or "making apps" aren't hallowed titles. Are you treating it more like they're saying "You'll be a super successful musician" or "You'll be a multi-millionaire app maker"? Because I'm not seeing it like that.


Samples are just instruments, and sequencers are just composition tools. It's the end result that matters.


Oh so you're familiar with EDM?


Yes, I studied music at the conservatorium and have been a professional producer for years, doing pop and EDM etc. I've used most of the known DAWs, from FL to Pro Tools.


It's really not. Some of the most beautiful songs are just artfully arranged samples.

https://www.youtube.com/watch?v=_h5WJ6J6IlE


Everything is a remix, https://vimeo.com/14912890.


Amazing presentation. Concentrates on the content, works on mobile[0], no bullshit effects.

[0] within the constraints of Android's embarrassingly crappy audio subsystem


Am I missing something? I went through all the tutorials and AFAICT there isn't much here. It seemed like "here's a piano. Here's some music made on the piano. Now bang on the piano. Fun yea?"

Is there really any learning here? Did I miss it? I saw the sample songs and a few minor things like "major chords are generally considered happy and minor sad" etc., but I didn't feel that by going through this I'd actually have learned much about music.

I'm not in any way against EDM or beat-based music. I bought Acid 1.0 through 3.0 back in the 90s, which AFAIK was one of the first types of apps to do stuff like this. My only point is I didn't feel like I was getting enough learning to truly use a piece of software like this. Rather, it seemed like a cool, flashy page with a low content ratio. I'm not sure what I was expecting. I guess I'd like some guidance on which notes to actually place where and why, not just empty grids and very vague direction.



