Before diving into "real research", some students prefer to learn only the bare-minimum prerequisite courses/material, then start reading survey papers and developing their skills along the way.
There are others who prefer to go through courses from Introduction to Advanced to Independent Study before finally diving in.
The former will probably struggle with fundamentals, while the latter spends too much time learning things that may never be completely useful.
At what point should one (assume, if necessary, that I am talking about STEM fields) decide to dive into research, making sure one is neither too hasty nor too slow?
Take, for example, a person who wishes to write software in Python that does engineering calculations. He could read something like an Intro to Python and start coding directly, or he could first read the documentation of math libraries, study similar libraries in C/Fortran, learn about coding efficiency and rules of thumb, and only then start.
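To make the first approach concrete, here is a minimal, hypothetical sketch of the kind of engineering calculation such a person might write with only introductory Python knowledge and no specialized libraries (the beam dimensions and loads are invented for illustration):

```python
def cantilever_tip_deflection(force_n, length_m, e_pa, i_m4):
    """Tip deflection of a cantilever beam under an end load:
    delta = F * L**3 / (3 * E * I)."""
    return force_n * length_m**3 / (3 * e_pa * i_m4)

def rect_second_moment(width_m, height_m):
    """Second moment of area of a rectangular cross-section: I = b * h**3 / 12."""
    return width_m * height_m**3 / 12

# Example: steel beam (E = 200 GPa), 2 m long, 1 kN end load,
# 40 mm x 60 mm rectangular cross-section.
I = rect_second_moment(0.04, 0.06)
delta = cantilever_tip_deflection(1000.0, 2.0, 200e9, I)
print(f"Tip deflection: {delta * 1000:.2f} mm")  # prints: Tip deflection: 18.52 mm
```

The "study first" person would instead reach for NumPy or SciPy and worry about vectorization and numerical robustness before writing a line; both routes eventually converge, which is exactly the trade-off in the question.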
How do you prevent yourself from taking too little or too much time before beginning?
This isn't a one-size-fits-all problem. People should move into research when they're ready to do so, and in consultation with their advisors, when appropriate.
That said, the approach I'd tend to advocate is to ramp down classwork while ramping up research. In that sense, the student controls the pace at which she learns, and can adjust the selection of coursework as time goes on to support or to complement the research work. Moreover, there's generally the assumption on the part of the advisor that the first few months aren't going to feature a lot of useful scientific results; they'll mainly be spent learning techniques and tools and basic concepts and understanding.
So the way to figure out whether one is ready to start research is to try some "low-hanging fruit" problems: if the student can handle the basics, she can start moving on to the rest. If not, at least she has a better handle on what she still needs to learn.