It is absolutely no secret that the imperial system is kinda dumb. Like, did you know that the foot was divided into ten "thumb-lengths" back in ye olden days? Heck, there are records showing that inches were defined by the width of a man's thumb rather than the length of it (presumably incredibly precise measurements for the time!). And all in all, it turns out that the foot is defined as twelve inches, not the other way around.
And the mile? Five thousand two hundred and eighty feet??? What kind of number is that? Y'know, the word "mile" comes from the Latin "mille," meaning thousand. The mile was "a thousand paces" as the Romans defined it. In Britain it was apparently still a clean five thousand feet into the 1500s, built on the furlong measurement, which itself was based on the German foot! We can all blame Queen Elizabeth for taking our clean and easy 5K and ruining it, as under her the statute mile was fixed at eight furlongs of 660 feet each instead of 625 (the furlong having been measured out in a shorter foot), tacking 280 feet onto the mile.
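For what it's worth, the arithmetic checks out. Here's a quick sketch - the rods-to-furlongs-to-miles chain is the standard one, and the rest is just multiplication:

```python
# The statute mile, built up from the old English land-measuring units.
ROD_FT = 16.5            # 1 rod = 16.5 feet
RODS_PER_FURLONG = 40    # 1 furlong = 40 rods = 660 feet
FURLONGS_PER_MILE = 8    # the Elizabethan statute fixed the mile at 8 furlongs

furlong_ft = ROD_FT * RODS_PER_FURLONG
mile_ft = furlong_ft * FURLONGS_PER_MILE
old_mile_ft = 625 * 8    # 8 furlongs of 625 feet: the old 5,000-foot mile

print(furlong_ft, mile_ft, old_mile_ft)  # 660.0 5280.0 5000
```

Eight furlongs of 660 feet instead of 625, and there's your extra 280 feet.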
Shoutouts to nautical miles, though. In feet they make no sense (6,076.11549), but the nautical mile is meant to be one arcminute of the Earth's circumference - 1/60th of a degree of arc: much cleaner!
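You can sanity-check both the ugly number and the clean one. The nautical mile is defined today as exactly 1,852 meters; the circumference figure below is my own assumed modern pole-to-pole value, not something from the original definition:

```python
# The modern nautical mile is defined as exactly 1,852 meters.
NM_M = 1852.0
FT_PER_M = 1 / 0.3048            # 1 foot = 0.3048 m, exactly

print(NM_M * FT_PER_M)           # the ugly 6,076.115... figure in feet

# The arcminute claim: a full circle has 360 * 60 = 21,600 arcminutes.
meridional_circumference_m = 40_007_863   # assumed modern pole-to-pole value
arcminute_m = meridional_circumference_m / 21_600

print(round(arcminute_m))        # 1852: right on the defined nautical mile
```

One arcminute of a circle around the poles really does land almost exactly on 1,852 meters.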
But anyways, I'm getting ahead of myself. Let's get back to the kings of the world of measurements: the Metric System.
Currently, the Meter is an SI unit, meaning it's part of the International System of Units. There are seven of these base units (Second, Meter, Kilogram, Ampere, Kelvin, Mole, and Candela), all of which are based on a bunch of really confusing constants (the Speed of Light, the Planck Constant, the Elementary Charge, the Boltzmann Constant, the Avogadro Constant, the Luminous Efficacy [of 540 THz radiation], and the... uh.... "hyperfine transition frequency of Cs"). I'm not gonna go into all of them, because frankly I couldn't care less about the definition of the luminosity of exactly 540 THz in comparison to that one dysfunctional fluorescent bulb that likes to flicker on and off in my dining room.
The Meter was originally (in practical settings) defined by a standardized pendulum's swing (one with a period of two seconds), but then gravity and atmospheric pressure and other nasty stuff got in the way of that working, so the definition was swapped. I was actually pleasantly surprised to learn that the next definition of the meter was - no joke - one ten-millionth of the distance between the Equator and the North Pole, assuming the Earth is a nice smooth sphere.
It's like - what??? That number is SO CLEAN. To scud with the fact that it doesn't quite work 'cause Earth isn't a perfect sphere in any direction, or because there's a slight stretch and compression based on the tiniest fractions of gravitation between the planet and the sun over the course of our orbit due to the tilt of the axis - I LOVE that. It's such a CLEAN. SCUDDING. NUMBER.
Practical representations, of course, were created. Turns out everything mathematically important is kept in Paris, so the original physical standard of a meter was a platinum bar held there. It was eventually replaced by a series of bars (each made of platinum-iridium: the same stuff still used for physical models today).
Of course, however, no unit gets itself a perfectly clean slate. For a brief period of just over a couple decades, some loser decided to define the meter as exactly 1,650,763.73 wavelengths of a specific emission line of Krypton-86. What on earth does that mean? That sounds like there're at least, like, twenty prerequisites for measuring that sort of thing! We can thank Einstein for saving our sorry hides (again), though, as the meter was redefined in 1983 in terms of the speed of light. It's stayed that way since, and remains to this day one of the essential units of measurement across the entire world.
That begs the question... how is the speed of light defined?
First of all, let's ignore the whole "we have no clue what the one-way speed of light is" thing. So long as we can arbitrarily decide that the round trip averages out - that 2c/2 = c - then reality as we know it actually works. Y'all do not want me to get into skeptical nihilism and start digging through the rabbit hole of universal methodic doubt and cogito ergo sum. The speed of light was originally measured through the efforts of a pair of scientists: Ole Roemer and Christiaan Huygens.
Roemer was observing the moon Io around Jupiter, attempting to better discern its orbital period. He studied it over the course of several years and came across an anomaly: there was a consistent delay in the timing of eclipse emergences (when Io came out from behind Jupiter, making itself observable) depending on the time of year. The delay added up to about eleven minutes depending on where Earth was in its orbit. In a stroke of genius, Roemer realized that the only explanation was that light must have a finite speed, and that the changing distance between Earth and Jupiter throughout the year was delaying the light coming from the moon.
Huygens took Roemer's measurements and did the math, finding the speed of light to be approximately 2.1E8 meters per second. The modern value is 2.99E8 m/s (the difference came from inaccuracies in the measurements themselves, not the method). Even with all the limitations of the era (this was the late 1600s), they came remarkably close to the actual figure. It was quite impressive, actually.
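You can redo Huygens' math on a napkin. The figures below are my own stand-ins in the spirit of the calculation: Roemer's announced delay was about 22 minutes for light crossing the full diameter of Earth's orbit, and I'm plugging in the modern Earth-Sun distance (Huygens' own distance estimate was smaller, which is where his low result came from):

```python
# Speed of light from Roemer's eclipse delays, Huygens-style:
# light crosses the diameter of Earth's orbit in ~22 minutes.
delay_s = 22 * 60                 # Roemer's estimate, in seconds
AU_M = 1.496e11                   # modern Earth-Sun distance; Huygens' was smaller
orbit_diameter_m = 2 * AU_M

c_estimate = orbit_diameter_m / delay_s
print(f"{c_estimate:.2e}")        # ~2.27e8 m/s with the modern distance
```

One division, off by only about 25% from the real thing, in the 1600s. Not bad at all.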
The first truly precise measurements came from Simon Newcomb and his protégé Albert Michelson. They took measurements using arrangements of mirrors and such, constantly zeroing in closer and closer on the actual speed. Michelson first found the speed to be 299,910 ± 50 km/s before joining Newcomb, who narrowed it down to 299,860 ± 30 km/s. The most accurate of Michelson's experiments came out to about 299,774 ± 11 km/s: a measurement published after his death.
While all those numbers are great and all, they're all based on the day's definition of a meter: one ten-millionth of the Earth's distance between the Equator and the North Pole, or one of those platinum-iridium bars that they've probably still got locked up in Paris somewhere. Of course, the speed of light was narrowed down further and further, and eventually fixed at 299,792,458 m/s at one of those big conferences where the people that people decided get to decide stuff decide what stuff is (the 1983 General Conference on Weights and Measures, in this case).
But here's the problem:
299,792,458 m/s is defined by the meter, yes?
Do you want to know how it was fixed to that number?
By arbitrarily fixing it to that number...
...and then defining a meter by one over it.
Do you SEE THE PROBLEM???
The speed of light was defined by the meter, but then it suddenly swapped so that the meter became defined by the speed of light! I guess equals signs go both ways, but that is THE MOST RECURSIVE mathematical phenomenon to ever exist! You can't just decide that because one thing is one thing, the other thing is that one thing too! If Socrates is a man and all men are mortal, then Socrates is mortal... but that doesn't mean that because I'm a man and I'm mortal, I'm Socrates! Socrates was a butt-ugly genius who talked too much and--
--kay, actually. Maybe I am Socrates. BUT THAT'S NOT THE POINT.
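Spelled out in code, the circularity is almost funny: fix the number, then define the meter as whatever makes the number true.

```python
# Since 1983: the speed of light is DEFINED as this exact integer...
C_M_PER_S = 299_792_458

# ...and the meter is DEFINED as the distance light travels
# in 1/299,792,458 of a second. So "measuring" a meter with light:
t = 1 / C_M_PER_S        # seconds
meter = C_M_PER_S * t    # distance = speed * time

print(meter)             # 1.0 (up to floating-point rounding): true by fiat
```

Light can never measure out anything but exactly one meter per 1/299,792,458 of a second, because that's what a meter is now.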
The point, I think, is that these things really don't matter all that much. At the end of the day, it couldn't matter less to you or me how incredibly precise a meter is in relation to how wide your flatscreen television is. What matters is that your tape measure spans the distance and that the TV fits on your wall. I think it's important to remember that, regardless of how science has been honed to a razor-sharp edge of precision, all these measurements and definitions and units are arbitrary anyways. People will never be perfect, and the nanoscopic question of what the heck a meter is or isn't works great on the papers signed by the folks voting on these things, but it really couldn't matter less to you or me.
So with all that said, if you're going to come away with one thing, it must be this:
The imperial system is defined in relation to metric.
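And that's literal, not rhetorical. Since the international yard and pound agreement of 1959, the foot has been exactly 0.3048 meters, by definition:

```python
# Imperial lengths, as legally defined since 1959: exact metric multiples.
FOOT_M = 0.3048           # 1 foot = 0.3048 m, exactly
INCH_M = FOOT_M / 12      # the inch works out to exactly 25.4 mm
MILE_M = FOOT_M * 5280    # the statute mile: exactly 1,609.344 m

print(round(INCH_M * 1000, 4), round(MILE_M, 3))  # 25.4 1609.344
```

Every inch on your tape measure is, officially, just 25.4 millimeters wearing a costume.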
I hope you remember this destructive act of undoing the next time you convert feet to meters on Google.