Food for thought: editing so the skater lands tricks on the beat has gotten easier.
Editing used to be done on an editing deck: you put your raw footage tapes in and manually recorded, clip by clip, onto the edit tape along with the song, which played from another tape deck. All decisions and planning had to be made beforehand. This is linear editing; you can't go back and make changes. The first editing software was also linear.
Then non-linear editing software came out, and you could go back and change anything you'd done. You could plan more precisely for clips to hit on a beat. But on early non-linear editing software you still had to worry about render times. You'd make one change, render, and 30+ minutes later you could finally review it. How much of that can you tolerate?
I remember Dan Magee even talking about this render delay problem when editing Waiting for the World, which came out in 2000. He made decisions and stuck with them because too much time was wasted if you nitpicked tiny details.
Now fast forward to today. Make changes and the software renders seamlessly as you work. No delays. You can stress over whether the beat should hit one or two frames earlier.
So, in sum, this phenomenon was shaped largely by the available technology.
Edit: watch the intro to the Trilogy Menace section to see a linear editing bay. My first experience editing was on one of these in high school (I'm not that old... the software already existed).