Considerations for improving the initial musical learning experience.
Here are insights relevant to using digital technology, standalone or as an adjunct to other methods, to help students understand music theory and apply it on-instrument.
In addition to my love affair with the guitar, music and improvisation over the last few decades, the following three areas fascinate me. How can they be applied to both application and content design, to reveal the simplicity inherent in music theory?
- Music psychology and learning
- How to minimise complexity in system design
- User experience design
We’ll briefly discuss these below.
Before recording was invented, the only way to share music was via music notation. All the detail had to be present. For many, this means the wood (the musical concepts of theory) cannot be seen for the trees (the embodiment of those concepts in the notes, along with performance details such as articulation, dynamics, phrasing and so on).
These days, technology offers alternatives for presenting the concepts simply, without this accompanying detail. This provides another way for teachers to help their students build a solid mental framework on which to tie together associations between the various concepts. For example, concepts such as pitch, the intervals made by two pitches, and rhythm, as points in time where sounds start and stop, give a simple basic framework on which everything else can build in a structured way (motifs, phrases, chords, scales, and so on).
Music psychology and learning
This is mind-blowing. I first got into this when I wanted to know how pitch was defined. That led me to look up perception, and that started a journey of exploration into how on earth air molecules moving backwards and forwards as they enter our ears end up producing our brain’s emotional response to, and appreciation of, music as it unfolds.
Some very interesting facts emerge from all the research that’s been done. For example, the brain absolutely hates randomness … it’s always looking to make sense of its environment, and this has some obvious consequences for musicians, especially those starting their musical journey. In particular, the sooner a student appreciates musical phrasing, and how to properly use the notes in a scale, the better. Trial-and-error attempts without that understanding can be very demotivating.
Other very interesting observations concern how all children learn. Their first experiences after birth are all about touch and sound. Then vision comes along. Then speech develops, imitating their parents. Reading has to wait. Children also have a good sense of rhythm (in fact, our bodies are all about rhythms and synchronisation, from breathing, to heartbeat, to circadian rhythms). Children recognise melodies, and certainly know if their favourite melody gets changed or sung out of tune by their mother. They can memorise tunes, and sing them.
What does all this tell us? We all (unless physically or cognitively impaired) have an innate sense of hearing musical relationships (intervals, and the larger musical ideas built from intervals: phrases, verses, songs, sections of an opus), and an innate sense of rhythm. Should music notation be the initial or early experience students have, when this contradicts nature? Should relationships (such as chord tone to chord root, or chord to scale), and not notes, get more attention? Imitation is vital, accompanied by explanation.
I heartily recommend the following two books, which can almost be read like novels (there are many other, more academic ones):
- Music, the Brain, and Ecstasy, by Robert Jourdain
- Sweet Anticipation, by David Huron
Applying lessons from system design to explaining music theory
I’ve spent decades working on the design and architecture of many types of complex system software (CAD/CAM, computer languages and compilers, computer graphics, device emulation, virtual machines, embedded systems …). Along the way I learned to appreciate that a design is easier to understand and maintain when each part of it takes care of some specific job, with unnecessary detail removed, and with as little knowledge of the other parts as possible.
Having studied music theory and applied it for decades, I realised life could be a lot easier for us musicians by taking a similar approach, combined with widely familiar visualisations. In the context of music, maintenance means keeping foundational musical concepts present all the time.
Huge simplifications arise from:
- Deprioritising note names. An isolated note (with no other context) is meaningless; two or more notes always form relationships (and keep the same sound flavour when transposed).
- Removing note names when discussing note combinations, and using intervals instead.
- Removing note names when discussing musical relationships (e.g. chord to scale), and using intervals instead.
- Making note names available for query when wanted (e.g. what note does this string/fret or piano key produce?).
- Presenting intervals both as semitone counts (an obvious mapping to the instrument) and using their theoretical names (major 3rd, etc.).
- Removing rhythm, initially, from any discussion about note combinations.
- Removing actual notes when discussing rhythm, where appropriate.
- Visualising rhythm using rectangles denoting duration.
- Editing rhythm by changing rectangle widths, chopping rectangles up, or gluing them together.
- Dropping melody selections, or chords, into the rhythm.
- Visualising rhythm during playback.
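The rhythm-as-rectangles idea above can be sketched in code. This is a minimal illustration of my own, not taken from any actual product: a rhythm is just a list of durations (rectangle widths), and the hypothetical `split` and `glue` operations stand in for chopping and gluing rectangles on screen.

```python
# A rhythm as a list of durations in beats; each value is one "rectangle" width.
# Sketch only: a real editor would render these as draggable rectangles.

def split(rhythm, index, at):
    """Chop one rectangle into two whose widths sum to the original."""
    d = rhythm[index]
    assert 0 < at < d, "split point must fall inside the rectangle"
    return rhythm[:index] + [at, d - at] + rhythm[index + 1:]

def glue(rhythm, index):
    """Merge a rectangle with its right-hand neighbour."""
    return rhythm[:index] + [rhythm[index] + rhythm[index + 1]] + rhythm[index + 2:]

bar = [1.0, 1.0, 1.0, 1.0]   # four quarter notes in 4/4
bar = split(bar, 0, 0.5)     # -> [0.5, 0.5, 1.0, 1.0, 1.0]
bar = glue(bar, 3)           # -> [0.5, 0.5, 1.0, 2.0]

# "Dropping" a melody selection into the rhythm is then just pairing
# pitches with rectangle widths:
melody = list(zip(["G", "B", "D", "G"], bar))
```

Because notes and durations stay separate until the final pairing, the same rhythm can be refilled with any note pattern with one operation.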
This is where digital technology shines: all of these simplifications become possible once we make use of computer graphics, sound and multimedia. This approach can help students quickly get to the foundations of music, and be used standalone for teaching them, or as an accompaniment to other teaching methods.
For example, we can visualise interval combinations extremely simply by using a very common item … a clock face (or watch face). Below, the G major scale is laid out along the bass E string. The red circle on the guitar denotes the scale start note (the tonic); it appears twice. The numbers labelling the various scale members show their distances in semitones from the nearest tonic below them.
Below the guitar neck is a clock face, with times 0 (midnight) to 11 AM. Notice how the pattern of coloured circles on the clock (“occupied times”) and “unoccupied times” is mirrored on the guitar neck.
The clock face also has 12 spokes around it, with orange-coloured spokes at times 0, 4 and 7 AM, indicating a chord found in the scale, rooted at 0 AM (the scale start note).
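Underneath the clock-face picture is simple arithmetic modulo 12: each scale member is a semitone distance from the tonic, wrapped around the twelve "times". A short sketch (the names and functions here are my own, chosen for illustration):

```python
# Scale members as "clock times": semitone distances from the tonic, mod 12.
MAJOR_SCALE = {0, 2, 4, 5, 7, 9, 11}   # whole/half-step pattern W W H W W W H
MAJOR_TRIAD = {0, 4, 7}                 # the orange spokes at times 0, 4 and 7

def occupied(time):
    """Is this clock position a scale member (an 'occupied time')?"""
    return time % 12 in MAJOR_SCALE

# The chord rooted at 0 AM lies entirely inside the scale:
assert MAJOR_TRIAD <= MAJOR_SCALE

def transpose(times, by):
    """Transposing a shape just rotates it around the clock face."""
    return {(t + by) % 12 for t in times}
```

The transposition function makes the "same sound flavour when transposed" point concrete: the set of distances between members is unchanged by rotation, which is why one clock pattern covers all twelve keys.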
User experience design
UX design is critical. I remember seeing a wonderful picture that captured the essence of UX design. It showed a baby in a cot, with a bunch of small fluffy animal toys hanging from a horizontal ring that the parents had put together. It then showed the view as seen by the baby … a bunch of animal rear-ends! There’s a lesson there.
Here are some of the ways we’ve addressed the UX for helping you get familiar with musical concepts. Firstly, the design is a mix of visuals and sounds. It carries theoretical information, such as interval and note names and scale structure, that can be changed with a key command; this is just one aspect of functionality made available very simply (usability). Menu design gives access to the usual things, such as scales and chords. Simple key commands let you visualise scales in different regions (similar to CAGED), and the mouse can be used to draw out a rectangular area in which to lay out the current scale or chord. By holding down the ALT key over a scale note on the guitar, you hear and see the appropriate chord rooted at that note.
Content-wise, the design enables interactive multimedia lesson and practice content for testing the understanding of musical concepts on-instrument, with a virtual machine that knows how to correct mistakes on-instrument. This is vital for supporting student interactions (for example, “convert this major chord to a minor chord”). While the design separates rhythm and notes, it also enables the design of pure rhythmic concepts that can, with a single click, be filled with a note pattern from the instrument.
There is also a fully fledged video-based help system which, when turned on, shows relevant help for a button, icon, menu item, and so on.
By applying the simplifications mentioned above, in particular avoiding (or de-emphasising) note names, and not mixing rhythm with notes when discussing note combinations and musical relationships, a lot of clarity is gained, and it can be taken to the instrument on day one (with the obvious caveat of a beginner’s lack of dexterity). In particular, intervals are better appreciated if initially presented as the number of semitones between the notes involved, rather than by their theoretical names. This means literally anyone can create a scale on-instrument in minutes, without knowing a single note name or where to locate one. They can hear and play what’s being presented, and start experimenting, on day one. That has proven to be very motivating and encourages further learning. Technology allows all these simplifications to be made, and this approach can then be used alongside traditional methods, including notation, if wanted.
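The "create a scale in minutes" claim reduces to cumulative semitone steps: along a single string, one fret equals one semitone, so a scale's step pattern directly yields fret numbers, no note names required. A sketch under that assumption (function name is mine; G major's tonic on the bass E string is at fret 3):

```python
from itertools import accumulate

# Major-scale step pattern in semitones: whole, whole, half, whole, whole, whole, half.
MAJOR_STEPS = [2, 2, 1, 2, 2, 2, 1]

def frets_for_scale(start_fret, steps):
    """Fret positions along one string, running sums of the step pattern."""
    return list(accumulate(steps, initial=start_fret))

# G major along the bass E string, tonic at fret 3.
# The last entry, fret 15, is the tonic again, one octave up.
print(frets_for_scale(3, MAJOR_STEPS))  # [3, 5, 7, 8, 10, 12, 14, 15]
```

Changing `start_fret` transposes the whole scale, and swapping in another step pattern (say, the natural minor's 2, 1, 2, 2, 1, 2, 2) gives a different scale, all without naming a single note.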