Over the last 100 years, technological prowess has arguably been the human race's greatest achievement.
Albert Einstein famously published the Theory of Relativity in the early 20th century.
Since then, scientific theory has been the driving force behind much of that progress.
Einstein’s theory of special relativity came to the fore in 1905, building on experimental and theoretical work by the likes of Albert A. Michelson, Hendrik Lorentz, and a number of other physicists.
Then, between 1907 and 1915, Einstein developed general relativity, with contributions subsequently made by many others; the theory’s final form was published in 1916.
However, how have Einstein's methodologies changed over the years, how many are still used by modern scientists, and are people as influenced by them as they once were?
Building on Einstein’s theories
In 1926, Ronald Fisher, a British statistician, geneticist, and academic, popularised his work on randomised experimental design.
This formalised the use of independent variables alongside dependent and control variables.
Fisher’s work on the physical design of the experimental process was revolutionary.
It also paved the way for the large-scale data collection on which the modern scientific method is based.
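To make the idea concrete, here is a minimal Python sketch of a randomised design; the group sizes, effect size, and measurement are all hypothetical, not drawn from Fisher's own experiments.

```python
import random
import statistics

# A minimal sketch of Fisher-style randomised design (all values hypothetical):
# subjects are assigned to treatment or control at random, so the independent
# variable (receiving the treatment) is the only systematic difference.
random.seed(42)

subjects = list(range(20))
random.shuffle(subjects)                              # the randomisation step
treatment_group, control_group = subjects[:10], subjects[10:]

def measure(treated: bool) -> float:
    """Hypothetical dependent variable: baseline noise plus a treatment effect."""
    return random.gauss(10.0, 1.0) + (2.0 if treated else 0.0)

treated_outcomes = [measure(True) for _ in treatment_group]
control_outcomes = [measure(False) for _ in control_group]

# Comparing the group means estimates the treatment effect (here, close to 2.0).
print(statistics.mean(treated_outcomes) - statistics.mean(control_outcomes))
```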
What were the main concerns with the experimental method?
Ronald Fisher’s main concerns with his experimental method centred on validity, reliability, and replicability.
Establishing proof of how valid an experimental process is remains vital in a modern world filled with rampant pseudoscience.
In 1937, the first placebo experiment was published. This work began to bring to light the influence that we as humans exert on the scientific method.
It also pushed thinking towards reducing bias in the outcomes of scientific work.
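As a rough illustration of the principle, here is a toy Python sketch of blinded allocation; the participant names and group codes are invented for the example.

```python
import random

# A toy sketch of blinded allocation (names and codes hypothetical):
# participants are randomised to placebo or treatment, but anyone assessing
# outcomes only ever sees the opaque codes "A" and "B".
random.seed(7)

participants = [f"participant_{i}" for i in range(10)]
arms = ["treatment"] * 5 + ["placebo"] * 5
random.shuffle(arms)

code = {"treatment": "A", "placebo": "B"}       # key held only by the administrator
allocation_key = dict(zip(participants, arms))
assessor_view = {p: code[arm] for p, arm in allocation_key.items()}

print(assessor_view)  # arm identities never leak to the assessor, reducing bias
```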
How did computers revolutionise scientific methodology?
After the Second World War, economies across the globe experienced a post-war boom, a period that became known as the Golden Age of Capitalism.
The first computer simulations were run in this era, laying the groundwork for something revolutionary: the marriage between science and technology.
The ability to simulate natural environments while controlling different variables gave scientists a huge opportunity to investigate and collect data on processes of immense complexity.
These myriad outcomes and random possibilities could all be captured in an easily workable computer program.
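To illustrate the idea, here is a minimal Monte Carlo sketch in Python, a classic estimate of pi by random sampling rather than a reconstruction of any specific early simulation.

```python
import random

# A minimal Monte Carlo sketch (illustrative only): estimate pi by sampling
# random points and counting how many fall inside the unit quarter-circle.
# The same idea — simulate many random trials, then aggregate — underpins
# early computer simulations of complex physical processes.
random.seed(0)

def estimate_pi(trials: int) -> float:
    inside = 0
    for _ in range(trials):
        x, y = random.random(), random.random()
        if x * x + y * y <= 1.0:          # point lies inside the quarter-circle
            inside += 1
    return 4.0 * inside / trials

print(estimate_pi(1_000_000))  # converges towards 3.14159... as trials grow
```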
Looking ahead to the future of science
In 2009, a “robot scientist” driven by machine-learning algorithms was developed.
It was able to perform experiments independently, test hypotheses, and interpret its findings.
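In spirit, such a system closes the loop between hypothesis and experiment. Below is a highly simplified Python sketch of that loop; the "experiment", the candidate hypotheses, and the hidden law are all invented for illustration and bear no relation to the 2009 system's actual code.

```python
import random

# A toy closed-loop "robot scientist" (hypothetical logic): propose hypotheses,
# run experiments, and keep the hypothesis that best explains the observations.
random.seed(1)

def run_experiment(x: float) -> float:
    """Stand-in for a real experiment: a noisy measurement of a hidden law."""
    return 3.0 * x + random.gauss(0.0, 0.5)

def automated_loop(candidate_slopes, trials=50):
    """Return the candidate slope whose predictions best fit the measurements."""
    best, best_error = None, float("inf")
    for slope in candidate_slopes:            # each candidate is a hypothesis
        error = 0.0
        for _ in range(trials):
            x = random.uniform(0.0, 10.0)
            error += (run_experiment(x) - slope * x) ** 2
        if error < best_error:                # interpret results: keep best fit
            best, best_error = slope, error
    return best

print(automated_loop([1.0, 2.0, 3.0, 4.0]))  # recovers the hidden slope, 3.0
```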
With the progression of AI (artificial intelligence), scientific discovery could one day be carried out fully independently of human involvement.
It’s tough to predict what the future holds, but we’re looking forward to finding out!