Typically whenever you look closely at an object with complex behavior, there is a system inside made of smaller, simpler objects interacting to produce the complexity.
You'd expect that at the bottom, the smallest objects would be extremely simple and would follow some single physical law.
But the smallest objects we know of still have pretty complex behavior! So there's probably another layer underneath that we don't know about yet, maybe more than one.
I agree, and I think that your claim is compatible with the comment that you are responding to. Indeed, perhaps it's turtles all the way down and there is systematic complexity upon systematic complexity governing our universe that humanity has been just too limited to experience.
For a historical analogy: classical physics was, and still is, sufficient for most practical purposes, and we didn't need relativity or quantum mechanics until we had instruments precise enough to manipulate, or at least detect, those effects. I suppose macroscopic quantum phenomena existed all along, but they could have been treated simply as empirical material properties, without a systematic universal theory accounting for them, so long as instruments were not precise enough to explore and exploit the predictions of such a theory.
The experiments that led to the invention of quantum theory are relatively simple and involve objects you can touch with your bare hands without damaging them. Some are done in high school, e.g. the photoelectric effect.
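The quantum signature in that high-school experiment is compact enough to state here (this is the standard textbook relation, not something from the comments above):

```latex
% Einstein's photoelectric relation: the maximum kinetic energy of an
% ejected electron depends on the light's frequency \nu, not its intensity.
K_{\max} = h\nu - \phi, \qquad \nu_0 = \frac{\phi}{h}
% Below the threshold frequency \nu_0 no electrons are emitted at all,
% no matter how intense the light -- the part classical wave theory
% could not explain.
```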
While I did hedge my point regarding macroscopic quantum phenomena, I think the quantum nature of the photoelectric effect would have been harder to discern without modern access to single-wavelength light sources. You could still rely on precise optics to separate mixed light, I suppose, but without even those optics it would be harder still.
All the 19th century experiments that required monochromatic light, including those that characterized the photoelectric effect, used dispersive prisms, which separated light from the Sun or from a flame into its monochromatic components. These are simple components, easily available.
This allowed experiments where the frequency of light was varied continuously, by rotating the prism.
Moreover, already during the first half of the 19th century, it became known that by using gas-discharge lamps filled with various gases, or by heating certain substances in a flame, you can obtain monochromatic light corresponding to spectral lines specific to each substance. This allowed experiments where the wavelength of the light was known with high accuracy.
Already in 1827, Jacques Babinet proposed replacing the platinum meter standard with the wavelength of some spectral line as the basis for the unit of length. This proposal was developed and refined by Maxwell in 1870, who proposed using both the wavelength and the period of some spectral line for the units of length and time. Babinet's proposal was adopted into the SI in 1960, 133 years later, while Maxwell's was adopted in 1983, 113 years later.
So there were no serious difficulties in the 19th century in obtaining monochromatic light. The most important difficulty was that the available sources of monochromatic light had very low intensities compared with the lasers available today. The low-intensity problem was aggravated when coherent light was needed, as that could be obtained only by splitting the already weak beam that was available. Lasers provide not only high intensity but also coherent light, so they greatly simplify experiments.
> You'd expect that at the bottom, the smallest objects would be extremely simple and would follow some single physical law.
That presupposes that there's a bottom, and that each subsequent layer gets simpler. Neither proposition is guaranteed; indeed the latter seems incorrect, since quantum chromodynamics, which governs the internal structure of the proton, is much more complex than the interactions governing its external behavior.
Incompleteness is inherent to our understanding, as the universe is too vast for us to ever capture a holistic model of all the variables.
Gödel's result says something specific about formal axiomatic systems, akin to a special theory, but the idea generalizes to physical reality too. A formal system becomes physical when you write it out, and it is never complete. That suggests our grasp of physical systems themselves is always incomplete.
Systems can hypothesize about themselves, but they cannot determine why the rules they discover exist in the first place. Prior states are no longer observable, so there is always an incomplete history.
Conway's Game of Life can't explain its own origins, only itself, because the origins are no longer observable after they occur.
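That irreversibility can be made concrete: the Game of Life step function is not invertible, so a state cannot recover its own past. A minimal Python sketch (the grid size and the two example states are my own choices) showing two different states with the same successor:

```python
def step(grid):
    """One Game of Life step on a fixed grid; cells outside it count as dead."""
    rows, cols = len(grid), len(grid[0])
    nxt = [[0] * cols for _ in range(rows)]
    for r in range(rows):
        for c in range(cols):
            # Count live neighbors inside the grid bounds.
            n = sum(grid[r + dr][c + dc]
                    for dr in (-1, 0, 1) for dc in (-1, 0, 1)
                    if (dr, dc) != (0, 0)
                    and 0 <= r + dr < rows and 0 <= c + dc < cols)
            # Standard rules: birth on 3 neighbors, survival on 2 or 3.
            nxt[r][c] = 1 if n == 3 or (grid[r][c] == 1 and n == 2) else 0
    return nxt

dead = [[0, 0, 0] for _ in range(3)]
lone = [[0, 0, 0], [0, 1, 0], [0, 0, 0]]  # one isolated live cell

# Two different states, one successor: the step function is not invertible,
# so "what came before" cannot be recovered from the current state alone.
print(step(dead) == step(lone))  # True
```

Both states map to the all-dead grid, so an observer inside the system at the later step has no way to tell which history produced it.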
What are the origins of our universe? We can only guess, without the specificity of direct observation; with only simulation and theory, our understanding stays incomplete.
So the comment is right. We would expect to be able to define what is now but not completely know what came before.