ElShamah - Reason & Science: Defending ID and the Christian Worldview

Welcome to my library—a curated collection of research and original arguments exploring why I believe Christianity, creationism, and Intelligent Design offer the most compelling explanations for our origins. Otangelo Grasso



The Fundamental Properties of Nature: where did they come from?


Otangelo (Admin)

The fundamental properties of nature, the universal constants, the cosmic numbers that define our Universe: where did they come from?

https://reasonandscience.catsboard.com/t3165-the-fundamental-properties-of-nature-where-did-they-come-from

If you want to find evidence of God outside of the Bible, look into nature. Start asking "how" questions. Start digging deep, deeper and deeper, until you reach the bottom, where you cannot dig further. If you do this with physics, for example, and start to elucidate the fundamental properties of the cosmos, you will eventually arrive at the following:

second    (the unit of time, with the symbol s),
metre     (length, m),
kilogram  (mass, kg),
ampere    (electric current, A),
kelvin    (thermodynamic temperature, K),
mole      (amount of substance, mol), and
candela   (luminous intensity, cd).

The International System of Units is based on fundamental properties of the universe: time, length, mass, electric current, thermodynamic temperature, amount of substance, and luminous intensity.
These fundamental properties are the most basic ones of our world. They are themselves ungrounded, and they ground everything else, so you cannot dig deeper. Now here is the thing:
These properties are fundamental constants that are like the DNA of our Universe. They are not calculable from even deeper principles currently known. The constants of physics are fundamental numbers that, when plugged into the laws of physics, determine the basic structure of the universe. An example of a fundamental constant is Newton’s gravitational constant G, which determines the strength of gravity via Newton’s law.
These constants (1) have a fixed value, and (2) are just right to permit a life-permitting universe. For life to emerge in our Universe, the fundamental constants could not have deviated by more than a fraction of a percent from their actual values. The BIG question is: why is that so? These constants cannot be derived from other constants and have to be determined by experiment. Simply put: science has no answer and does not know why they have the values they have.

Premise 1: The fundamental constants in the universe, such as Newton's gravitational constant (G), determine the basic structure and behavior of the universe.
Premise 2: The values of these fundamental constants are not derived from other constants or deeper principles known to us.
Conclusion: Therefore, the specific values of these fundamental constants appear to be finely tuned to permit a life-permitting universe, which implies design.

Explanation: The syllogism presents a design inference based on the premise that the fundamental constants are crucial for the basic structure and behavior of the universe. Since their values are not derived from other constants or from deeper principles, and since those values exhibit fine-tuning that permits a life-permitting universe, the inference is that they point to a purposeful, intelligent designer.

The Standard Model of particle physics alone contains 26 such free parameters. The finely tuned laws and constants of the universe are an example of specified complexity in nature. They are complex in that their values and settings are highly unlikely, and specified in that, out of an essentially infinite range of possible non-life-permitting values, they match the specific requirements needed for life.
The likelihood of a life-permitting universe arising from natural, unguided causes has been estimated at less than 1 in 10^136.

One could object and say that the laws and constants of physics could not be different, in other words, they are due to physical necessity, and therefore, no fine-tuner was required. Others might say:
The laws of physics are described, not prescribed. As the universe cooled after the Big Bang, symmetries were spontaneously broken, 'phase transitions' took place, and discontinuous changes occurred in the values of various physical parameters (e.g., in the strength of certain fundamental interactions, or in the masses of certain species of particle). So something did take place that should not and could not have done so if the current state of affairs rested on physical necessity. Symmetry breaking is precisely what shows that there was no physical necessity, since things did change in the early universe. There was a transition period before the composition of the fundamental particles that make up all matter was settled. The current laws of physics did not apply in the period immediately following the Big Bang; they took hold only after the density of the universe dropped below the so-called Planck density. There is no physical restriction or necessity that entails that a parameter could only have the value that is actualized, and no principle of physics says that physical laws or constants have to be the same everywhere and always. Since that is so, the question arises: what instantiated the life-permitting parameters? There are two possibilities: luck, or a Lawgiver.

[The Lord God] is eternal and infinite, omnipotent and omniscient, that is, he endures from eternity to eternity, and he is present from infinity to infinity; he rules all things, and he knows all things that happen or can happen.
—Isaac Newton, General Scholium to the Principia (1726)



CODATA RECOMMENDED VALUES OF THE FUNDAMENTAL PHYSICAL CONSTANTS: 2018
https://physics.nist.gov/cuu/pdf/wall_2018.pdf

Wolfram: As of Today, the Fundamental Constants of Physics (c, h, e, k, NA) Are Finally… Constant! November 16, 2018
Representatives of more than 100 countries agreed on a new definition of the base units for all weights and measures, an important vote for the future of the weights and measures used in science, technology, commerce, and even daily life. The agreement was the culmination of at least 230 years of wishing and labor by some of the world's most famous scientists. Units appear in any real-world measurement, and fundamental constants are of crucial importance for the laws of physics. Why do the constants have the values they have? Is humankind lucky that they do (e.g., even minute changes in the values of the constants would not allow stars to form)?
https://blog.wolfram.com/2018/11/16/as-of-today-the-fundamental-constants-of-physics-c-h-e-k-na-are-finally-constant/

International System of Units


The International System of Units (SI, abbreviated from the French Système international (d'unités)) is the modern form of the metric system. It is the only system of measurement with an official status in nearly every country in the world. It comprises a coherent system of units of measurement starting with seven base units, which are the 

second    (the unit of time, with the symbol s),
metre     (length, m),
kilogram  (mass, kg),
ampere    (electric current, A),
kelvin    (thermodynamic temperature, K),
mole      (amount of substance, mol), and
candela   (luminous intensity, cd).
https://en.wikipedia.org/wiki/International_System_of_Units

Heather Demarest Fundamental Properties and the Laws of Nature 2015
Fundamental properties are the most basic properties of a world. In terms of the new, popular notion of grounding, fundamental properties are themselves ungrounded and they (at least partially) ground all of the other properties. The laws metaphysically determine what happens in the worlds that they govern. These laws have a metaphysically objective existence. Laws systematize the world. Fundamental properties can be freely recombined. There are also no necessary connections between distinct existences. One law of nature does not necessarily depend on another. These laws have intrinsic properties, which they have in virtue of the way they themselves are. 
https://sci-hub.ren/10.1111/phc3.12222

Miguel A. Martin-Delgado The new SI and the fundamental constants of nature 16 October 2020
The fundamental constants are like the DNA of our Universe. Constants such as those in Table 1 are not calculable from the first principles currently known.

Symbol  Constant                Law
c       Speed of light          Theory of relativity
h       Planck constant         Quantum physics
k       Boltzmann constant      Thermodynamics
e       Electron charge         Quantum electrodynamics
NA      Avogadro constant       Atomic theory
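Since the 2019 redefinition of the SI, the five constants in this table no longer need to be measured: they have exact, defined values, and the base units are derived from them. A minimal sketch listing the official exact values:

```python
# The 2019 SI redefinition fixed these five constants to exact values;
# units like the kilogram and the kelvin are now derived from them.
SI_DEFINING_CONSTANTS = {
    "c":   (299_792_458,       "m/s",   "speed of light in vacuum"),
    "h":   (6.626_070_15e-34,  "J*s",   "Planck constant"),
    "e":   (1.602_176_634e-19, "C",     "elementary charge"),
    "k":   (1.380_649e-23,     "J/K",   "Boltzmann constant"),
    "N_A": (6.022_140_76e23,   "1/mol", "Avogadro constant"),
}

for symbol, (value, unit, name) in SI_DEFINING_CONSTANTS.items():
    print(f"{symbol:>3} = {value:.10g} {unit} ({name})")
```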

The speed of light in vacuum is a fundamental constant of nature which is independent of the other four fundamental constants: 
Newton's gravitational constant, 
Boltzmann's constant which arises in statistical physics, 
Planck constant which arises in quantum physics, and the 
Vacuum permittivity which refers to electromagnetism. 
A fundamental constant is something we can measure but not derive. To say anything deeper about why the speed of light has the value it does would require a revolutionary change to our laws of physics, one that reduced the number of fundamental constants.
http://www2.phys.canterbury.ac.nz/~dlw24/c.html


Schematic relationship between the base units of the new international system SI and its associated natural constants. In the central part the units and their dependencies among each other appear: the second influences the definition of five units, while the mole appears decoupled. The symbols of the constants used to define the units appear on the outside.


Table 1: The five universal constants of nature and their corresponding laws they are associated with. The laws of physics and chemistry allow us to describe natural phenomena once the values of the constants are known.
https://iopscience.iop.org/article/10.1088/1361-6404/abab5e#ejpabab5es4




Last edited by Otangelo on Sat Jul 01, 2023 12:40 pm; edited 36 times in total

https://reasonandscience.catsboard.com

1. The speed of light (Wed Aug 04, 2021 10:58 am)

Because the speed of light in a vacuum has not changed across numerous repeated measurements, it has been accepted as a constant. Why does it not change? Nobody knows. It has that intrinsic property without scientific explanation.

Michael Guillen, PhD, Believing Is Seeing: A Physicist Explains How Science Shattered His Atheism and Revealed the Necessity of Faith
Suppose a light quantum streaks across your room. Standing at the door with a souped-up radar gun, you clock it going, as expected, 299,792,458 meters per second. (Obviously I’m ignoring that your room is not a vacuum. But this simplification does not affect what I’m explaining.) Now, how about to someone driving by at 1,000,000,000 mph? Here’s the shocker: The light quantum will still seem to be traveling at 299,792,458 meters per second. Unlike the speed of a car or anything else in the universe, the speed of light doesn’t depend on one’s point of view; it’s the same for everyone, everywhere, always. It’s an absolute truth, the only speed in the universe with that supreme status—for reasons, mind you, we do not understand. It’s a mystery
https://3lib.net/book/17260619/6aa859
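Guillen's thought experiment reflects Einstein's velocity-addition rule, under which speeds never combine to exceed c. A minimal sketch (the 0.9c observer speed is an illustrative choice, not from the quote):

```python
C = 299_792_458.0  # speed of light in vacuum, m/s

def add_velocities(u: float, v: float) -> float:
    """Einstein's velocity-addition rule (replaces the Galilean u + v)."""
    return (u + v) / (1.0 + u * v / C**2)

# A light signal (u = c) measured from a frame moving at 0.9c
# is still observed at c, as Guillen describes:
print(add_velocities(C, 0.9 * C))        # still c (to floating-point precision)

# Two sub-light speeds never combine to exceed c:
print(add_velocities(0.9 * C, 0.9 * C))  # below c
```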

Dr. Christopher S. Baird Is the reason that nothing can go faster than light because we have not tried hard enough? July 20, 2017
If you look at the equations which are at the core of Einstein's theories of relativity, you find that as you approach the speed of light, your spatial dimension in the forward direction shrinks down to nothing and your clock slows to a stop. A reference frame with zero width and with no progression in time is really a reference frame that does not exist. Therefore, this tells us that nothing can ever go faster than the speed of light, for the simple reason that space and time do not actually exist beyond this point. Because the concept of "speed" requires measuring a certain amount of distance traveled in space during a certain period of time, the concept of speed does not even physically exist beyond the speed of light. In fact, the phrase "faster than light" is physically meaningless. It's like saying "darker than black."
https://www.wtamu.edu/~cbaird/sq/2017/07/20/is-the-reason-that-nothing-can-go-faster-than-light-because-we-have-not-tried-hard-enough/
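Baird's point, that space and time degenerate as v approaches c, is visible in the Lorentz factor gamma = 1/sqrt(1 - v^2/c^2). A minimal sketch:

```python
import math

C = 299_792_458.0  # speed of light in vacuum, m/s

def lorentz_gamma(v: float) -> float:
    """Time-dilation / length-contraction factor from special relativity."""
    return 1.0 / math.sqrt(1.0 - (v / C) ** 2)

for fraction in (0.1, 0.9, 0.99, 0.999999):
    g = lorentz_gamma(fraction * C)
    print(f"v = {fraction} c: lengths shrink and clocks slow by a factor of {g:.1f}")

# At v = c the factor diverges: lengths shrink to zero and clocks stop,
# which is Baird's point that no reference frame exists at or beyond c.
```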

Professor David Wiltshire Why is the speed of light what it is? 5 May 2017
Why is the speed of light in vacuum a universally constant finite value? The answer to this is related to the fact that the speed of light in vacuum, c, is an upper bound to all possible locally measured speeds of any object. There are only two possibilities for the laws of physics: (i) the relativity of Galileo and Newton in which any speed is possible and ideal clocks keep a universal time independent of their motion in space; or (ii) Einstein's special relativity in which there is an upper speed limit. Experiment shows that we live in a Universe obeying the second possibility and that the speed limit is the speed of light in vacuum.

Dr Alfredo Carpineti Why Is The Speed Of Light In Vacuum A Constant Of Nature? 30 April 2021
Light in a vacuum moves at a constant speed of 299,792,458 meters per second. So why is it this value? Why is it constant? And why does it pop up everywhere? We don’t have a conclusive answer just yet. It is one of those things we don’t know that hint at the complex machinery of the universe...
https://www.iflscience.com/physics/why-is-the-speed-of-light-in-vacuum-a-constant-of-nature/

Sidney Perkowitz Light dawns 18 September 2015
Light travels at around 300,000 km per second. Why not faster? Why not slower? We have now fixed the speed of light in a vacuum at exactly 299,792.458 kilometres per second. Albert Einstein showed that c, the speed of light through a vacuum, is the universal speed limit. According to his special theory of relativity, nothing can move faster. So, thanks to Maxwell and Einstein, we know that the speed of light is connected with a number of other (on the face of it, quite distinct) phenomena in surprising ways. Why this particular speed and not something else? Or, to put it another way, where does the speed of light come from?

Until quantum theory came along, electromagnetism was the complete theory of light. It remains tremendously important and useful, but it raises a question. To calculate the speed of light in a vacuum, Maxwell used empirically measured values for two constants that define the electric and magnetic properties of empty space. Call them, respectively, ε0 and μ0.

The thing is, in a vacuum, it’s not clear that these numbers should mean anything. After all, electricity and magnetism actually arise from the behaviour of charged elementary particles such as electrons. But if we’re talking about empty space, there shouldn’t be any particles in there, should there?

This is where quantum physics enters. In the advanced version called quantum field theory, a vacuum is never really empty. It is the ‘vacuum state’, the lowest energy of a quantum system. It is an arena in which quantum fluctuations produce evanescent energies and elementary particles.

What’s a quantum fluctuation? Heisenberg’s Uncertainty Principle states that there is always some indefiniteness associated with physical measurements. According to classical physics, we can know exactly the position and momentum of, for example, a billiard ball at rest. But this is precisely what the Uncertainty Principle denies. According to Heisenberg, we can’t accurately know both at the same time. It’s as if the ball quivered or jittered slightly relative to the fixed values we think it has. These fluctuations are too small to make much difference at the human scale; but in a quantum vacuum, they produce tiny bursts of energy or (equivalently) matter, in the form of elementary particles that rapidly pop in and out of existence.

These short-lived phenomena might seem to be a ghostly form of reality. But they do have measurable effects, including electromagnetic ones. That’s because these fleeting excitations of the quantum vacuum appear as pairs of particles and antiparticles with equal and opposite electric charge, such as electrons and positrons. An electric field applied to the vacuum distorts these pairs to produce an electric response, and a magnetic field affects them to create a magnetic response. This behaviour gives us a way to calculate, not just measure, the electromagnetic properties of the quantum vacuum and, from them, to derive the value of c.

In 2010, the physicist Gerd Leuchs and colleagues at the Max Planck Institute for the Science of Light in Germany did just that. They used virtual pairs in the quantum vacuum to calculate the electric constant Ɛ0. Their greatly simplified approach yielded a value within a factor of 10 of the correct value used by Maxwell – an encouraging sign! This inspired Marcel Urban and colleagues at the University of Paris-Sud to calculate c from the electromagnetic properties of the quantum vacuum. In 2013, they reported that their approach gave the correct numerical value.

The speed of light is, of course, just one of several ‘fundamental’ or ‘universal’ physical constants. These are believed to apply to the entire universe and to remain fixed over time. All these quantities raise a host of unsettling questions. Are they truly constant? In what way are they ‘fundamental’? Why do they have those particular values? What do they really tell us about the physical reality around us?

So, let’s assume that these constants really are constant. Are they fundamental? Are some more fundamental than others? What do we even mean by ‘fundamental’ in this context? One way to approach the issue would be to ask what is the smallest set of constants from which the others can be derived. Sets of two to 10 constants have been proposed, but one useful choice has been just three: h, c and G, collectively representing relativity and quantum theory.

In 1899, Max Planck, who founded quantum physics, examined the relations among h, c and G and the three basic aspects or dimensions of physical reality: space, time, and mass. Every measured physical quantity is defined by its numerical value and its dimensions. We don't quote c simply as 300,000, but as 300,000 kilometres per second, or 186,000 miles per second, or 0.984 feet per nanosecond. The numbers and units are vastly different, but the dimensions are the same: length divided by time. In the same way, G and h have, respectively, dimensions of [length^3/(mass × time^2)] and [mass × length^2/time]. From these relations, Planck derived 'natural' units, combinations of h, c and G that yield a Planck length, mass and time of 1.6 × 10^-35 metres, 2.2 × 10^-8 kilogrammes, and 5.4 × 10^-44 seconds. Among their admirable properties, these Planck units give insights into quantum gravity and the early Universe.
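The Planck units described above can be reproduced in a few lines. One caveat: the essay quotes h, but the standard formulas (and the quoted numerical values) use the reduced constant hbar = h/2π, so this sketch uses hbar:

```python
import math

h    = 6.626_070_15e-34   # Planck constant, J*s (exact since 2019)
hbar = h / (2 * math.pi)  # reduced Planck constant
c    = 299_792_458.0      # speed of light in vacuum, m/s (exact)
G    = 6.674_30e-11       # gravitational constant, m^3 kg^-1 s^-2 (measured)

planck_length = math.sqrt(hbar * G / c**3)  # ~1.6e-35 m
planck_mass   = math.sqrt(hbar * c / G)     # ~2.2e-8 kg
planck_time   = math.sqrt(hbar * G / c**5)  # ~5.4e-44 s

print(f"Planck length: {planck_length:.2e} m")
print(f"Planck mass:   {planck_mass:.2e} kg")
print(f"Planck time:   {planck_time:.2e} s")
```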

But some constants involve no dimensions at all. These are so-called dimensionless constants – pure numbers, such as the ratio of the proton mass to the electron mass. That is simply the number 1836.2 (which is thought to be a little peculiar because we do not know why it is so large). According to the physicist Michael Duff of Imperial College London, only the dimensionless constants are really ‘fundamental’, because they are independent of any system of measurement. Dimensional constants, on the other hand, ‘are merely human constructs whose number and values differ from one choice of units to the next’.

Perhaps the most intriguing of the dimensionless constants is the fine-structure constant α. It was first determined in 1916, when quantum theory was combined with relativity to account for details or ‘fine structure’ in the atomic spectrum of hydrogen. In the theory, α is the speed of the electron orbiting the hydrogen nucleus divided by c. It has the value 0.0072973525698, or almost exactly 1/137.

Whether it was the ‘hand of God’ or some truly fundamental physical process that formed the constants, it is their apparent arbitrariness that drives physicists mad. Why these numbers? Couldn’t they have been different?
https://aeon.co/essays/why-is-the-speed-of-light-the-speed-of-light
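The two dimensionless numbers the essay highlights, the proton-to-electron mass ratio and the fine-structure constant α = e²/(4π·ε0·hbar·c), can both be recomputed from CODATA values. A minimal sketch:

```python
import math

# CODATA 2018 values; e, h and c are exact by definition, the rest measured.
e    = 1.602_176_634e-19     # elementary charge, C
h    = 6.626_070_15e-34      # Planck constant, J*s
c    = 299_792_458.0         # speed of light in vacuum, m/s
eps0 = 8.854_187_8128e-12    # vacuum permittivity, F/m
m_p  = 1.672_621_923_69e-27  # proton mass, kg
m_e  = 9.109_383_7015e-31    # electron mass, kg

hbar = h / (2 * math.pi)

# Proton-to-electron mass ratio: a pure number, the same in any unit system.
print(f"m_p / m_e = {m_p / m_e:.2f}")  # ~1836.15

# Fine-structure constant, also dimensionless:
alpha = e**2 / (4 * math.pi * eps0 * hbar * c)
print(f"alpha     = {alpha:.10f}")     # ~0.0072973526
print(f"1/alpha   = {1 / alpha:.3f}")  # ~137.036
```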

Question:   What would happen if the speed of light would change?
Answer:  The speed of light c pops up everywhere.
- It's the exchange rate between mass and energy. Lower it, and stars need to burn faster to counteract gravity. Low-mass stars will go out and become degenerate super-planets. High mass stars will cool, but may also explode if they're helium-burning.
- It's the exchange rate between velocity and time (this is why we call it a speed, as the maximum speed possible in space, is a consequence of this). Time will run slower or faster if you change c. Distances also change.
- It's the fundamental constant of electromagnetism. The stable atomic orbitals will be at different energy levels, which would change the colors of all our lights (we generate light by knocking electrons around orbitals). Magnetic fields would be stronger or weaker, while electricity would be more or less energetic and move faster or slower in a different medium. The changing of atomic and molecular orbitals would break all our semiconductors. In high-energy physics, the electroweak unification takes over, this too would be distorted.
- The Einstein tensor would be greater with a slower speed of light, or smaller with a higher speed of light. This would affect the strength of gravity and feed into the rate at which stars burn, possibly triggering a positive feedback loop and blowing up every star as a pair-instability supernova if you did it right.
- The bioelectrical system nerves use (like the ones that make your brain) is tuned carefully to ionic potential, which is an electric potential. This'd change, and you would die.
https://arstechnica.com/civis/viewtopic.php?f=26&t=1246707

George F R Ellis:  Note on Varying Speed of Light Cosmologies March 5, 2018
The first key point is that some proposal needs to be made in this regard: when one talks about the speed of light varying, this only gains physical meaning when related to Maxwell’s equations or its proposed generalisations, for they are the equations that determine the actual speed of light. The second key point is that just because there is a universal speed vlim does not prove there is a particle that moves at that speed. On the standard theory, massless particles move at that speed and massive particles don’t; but by itself, that result does not imply that massless particles exist. Their existence is a further assumption of the standard theory, related to the wavelike equations satisfied by the Maxwell electromagnetic field. One can consider theories with massive photons. Any proposed variation of the speed of light has major consequences for almost all physics, as it enters many physics equations in various ways, particularly because of the Lorentz invariance built into fundamental physics.
https://arxiv.org/pdf/astro-ph/0703751.pdf

Question:   What determines the speed of light?
Answer:   Springer: Ephemeral vacuum particles induce speed-of-light fluctuations 25 March 2013
A specific property of vacuum called the impedance, which is crucial to determining the speed of light, depends only on the sum of the square of the electric charges of particles but not on their masses. If their idea is correct, the value of the speed of light combined with the value of vacuum impedance gives an indication of the total number of charged elementary particles existing in nature. Experimental results support this hypothesis.
https://www.springer.com/about+springer/media/springer+select?SGWID=0-11001-6-1414244-0

Gerd Leuchs: A sum rule for charged elementary particles 21 March 2013
As to the speed of light, the value predicted by the model is determined by the relative properties of the electric and magnetic interaction of light with the quantum vacuum and is independent of the number of elementary particles, a remarkable property underlining the general character of the speed of light.
https://link.springer.com/content/pdf/10.1140/epjd/e2013-30577-8.pdf

Marcel Urban The quantum vacuum as the origin of the speed of light 21 March 2013
The vacuum permeability μ0, the vacuum permittivity ε0, and the speed of light in vacuum c are widely considered as being fundamental constants, and their values, escaping any physical explanation, are commonly assumed to be invariant in space and time.
We describe the ground state of the unperturbed vacuum as containing a finite density of charged ephemeral fermion-antifermion pairs. Within this framework, ε0 and μ0 originate simply from the electric polarization and from the magnetization of these pairs when the vacuum is stressed by an electrostatic or a magnetostatic field, respectively. Our calculated values for ε0 and μ0 are equal to the measured values when the fermion pairs are produced with an average energy of about 30 times their rest mass. The finite speed of a photon is due to its successive transient captures by these virtual particles. This model, which proposes a quantum origin for the electromagnetic constants ε0 and μ0 and for the speed of light, is self-consistent: the average velocity of the photon c_group, the phase velocity of the electromagnetic wave c_φ, given by c_φ = 1/√(μ0 ε0), and the maximum velocity used in special relativity c_rel are all equal. The propagation of a photon being a statistical process, we predict fluctuations of its time of flight of the order of 0.05 fs/√m. This could be within the grasp of modern experimental techniques, and we plan to assemble such an experiment.
https://sci-hub.ren/10.1140/epjd/e2013-30578-7

Paul Sutter Why is the speed of light the way it is? July 16, 2020
https://www.space.com/speed-of-light-properties-explained.html

Impedance of free space
The impedance of free space, Z0, is a physical constant relating the magnitudes of the electric and magnetic fields of electromagnetic radiation travelling through free space.
https://en.wikipedia.org/wiki/Impedance_of_free_space

How does vacuum (nothingness) offer an impedance of 377 ohms to EM waves? Is it analogous to a wire (due to its physical properties) offering resistance to voltage and current signals?
That is based on the permittivity of free space, which is related to the fine-structure constant, whose inverse is roughly 137, a number physicists have been mulling over for millions of person-hours, wondering what it could possibly be based on.

My best answer, which is not original, but I will call “mine”, is the anthropic answer. If the FSC was even a tiny bit different, then nothing would exist, atoms would not hold together, stars could not form and ignite, yadda, yadda yadda, there wouldn’t be an “us” to even think of the question. For all we know there are a trillion trillion other universes, somewhere, each one with say a random FSC, and 99.99999999999% of them failed universes, from our viewpoint, as they are too cold, too dark, or too violent, or too short-lived for anything interesting to happen in them.
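A quick check of the "377 ohms" figure quoted above: the impedance of free space is Z0 = √(μ0/ε0). A minimal sketch using CODATA 2018 values:

```python
import math

mu0  = 1.256_637_062_12e-6  # vacuum permeability, N/A^2 (CODATA 2018)
eps0 = 8.854_187_8128e-12   # vacuum permittivity, F/m (CODATA 2018)

Z0 = math.sqrt(mu0 / eps0)  # impedance of free space
print(f"Z0 = {Z0:.2f} ohms")  # ~376.73, commonly rounded to 377
```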

Vacuum permittivity, commonly denoted ε0 (pronounced as "epsilon nought" or "epsilon zero") is the value of the absolute dielectric permittivity of classical vacuum. Alternatively it may be referred to as the permittivity of free space, the electric constant, or the distributed capacitance of the vacuum. It is an ideal (baseline) physical constant. Its CODATA value is:
ε0 = 8.8541878128(13)×10^-12 F⋅m^-1 (farads per meter), with a relative uncertainty of 1.5×10^-10. It is the capability of an electric field to permeate a vacuum. This constant relates the units for electric charge to mechanical quantities such as length and force. For example, the force between two separated electric charges with spherical symmetry (in the vacuum of classical electromagnetism) is given by Coulomb's law.
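Coulomb's law mentioned above can be evaluated directly from ε0: F = q1·q2/(4π·ε0·r²). A minimal sketch (the two-electron example is illustrative, not from the source):

```python
import math

eps0 = 8.854_187_8128e-12  # vacuum permittivity, F/m (CODATA 2018)
e    = 1.602_176_634e-19   # elementary charge, C (exact)

def coulomb_force(q1: float, q2: float, r: float) -> float:
    """Magnitude of the electrostatic force (newtons) between two point charges."""
    return q1 * q2 / (4 * math.pi * eps0 * r**2)

# Force between two electrons held one metre apart:
print(f"{coulomb_force(e, e, 1.0):.3e} N")  # ~2.3e-28 N
```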

Initial Conditions and “Brute Facts”   
Velocity of light 
If it were larger, stars would be too luminous. If it were smaller, stars would not be luminous enough. 
http://www.reasons.org/articles/where-is-the-cosmic-density-fine-tuning

Light proves the Existence of God - Here is why

The speed of light is a fundamental constant that plays an essential role in defining the very fabric of our universe. It is not only a fundamental limit on the propagation of all massless particles and the transmission of information; it also determines the behavior of electromagnetic radiation, which is responsible for virtually all the light and radiation we observe in the cosmos. The speed of light is intimately connected to the fundamental constants of nature, such as the vacuum permittivity and permeability, which govern the behavior of electric and magnetic fields in empty space. These constants, in turn, are tied to the quantum nature of the vacuum itself, where virtual particle-antiparticle pairs constantly fluctuate in and out of existence.

The precise value of the speed of light, along with other fundamental constants, is extraordinarily fine-tuned for the existence of a universe capable of supporting complex structures, stars, and ultimately, life as we know it. Even a slight variation in the speed of light would have profound implications, potentially preventing the formation of stable atoms, the nuclear fusion processes that power stars, or the very existence of electromagnetic radiation that enables the transmission of information and the propagation of light signals throughout the cosmos.

While no precise numerical bounds can be given, some estimate that even a 1-4% variation in the value of the speed of light from its current figure would likely render the universe incapable of producing life. Thus, the speed of light stands as an immutable cosmic principle, deeply intertwined with the fundamental laws of physics and the delicate balance of constants that makes our universe hospitable for the emergence and evolution of complexity and conscious observers like ourselves.

The speed of light in vacuum, denoted c, is defined by two fundamental physical constants: 1) The permittivity of free space (vacuum permittivity), denoted ε0 and 2) The permeability of free space (vacuum permeability), denoted μ0. Specifically, the speed of light c is related to these constants by the equation: c = 1 / √(ε0 * μ0). The currently accepted values are: ε0 = 8.854187817... × 10^-12 F/m (farads per meter) and μ0 = 1.256637061... × 10^-6 N/A^2 (newtons per ampere squared). Plugging these precise constant values into the above equation gives: c = 299,792,458 m/s (meters per second). This figure matches the defined value of the speed of light extremely accurately based on experimental measurements.
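The relation c = 1/√(ε0·μ0) from the paragraph above can be checked numerically. A minimal sketch using CODATA 2018 values:

```python
import math

eps0 = 8.854_187_8128e-12   # vacuum permittivity, F/m
mu0  = 1.256_637_062_12e-6  # vacuum permeability, N/A^2

# Maxwell's relation between the electromagnetic constants and light speed:
c = 1.0 / math.sqrt(eps0 * mu0)
print(f"c = {c:,.0f} m/s")  # ~299,792,458, matching the defined value
```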

What defines or determines the specific values of the vacuum permittivity (ε0) and vacuum permeability (μ0) constants that then give rise to the defined speed of light value? The answer is that in our current theories of physics, the values of ε0 and μ0 are taken as immutable fundamental constants of nature that are not further derived from more basic principles. They have been precisely measured experimentally, but their specific values are not calculable from other considerations. However, there are some theoretical perspectives that provide insight into where these constants could potentially originate from:

1. In some unified field theory approaches like string theory, the constants like ε0 and μ0 could emerge as vacuum expectation values from a more fundamental unified framework describing all forces and dimensions.
2. The values may be inevitably tied to the particular energy scale where the electromagnetic and strong/weak nuclear forces became distinct in the early universe after the Big Bang.
3. Some have speculated they could be linked to the fundamental properties of space-time itself, or to the Planck scale, where quantum gravitational effects become significant.

ε0 and μ0 are treated as fundamental constants in our theories, which means they have specific, fixed values that cannot vary. They are not free parameters that can take on any value. The values of ε0 and μ0 are tied to the properties of the vacuum itself and the way electromagnetic fields propagate through it, which is a fundamental aspect of the universe's laws.   If ε0 and μ0 could take on any value from negative to positive infinity, it would violate the internal consistency and principles of Maxwell's equations and electromagnetism as we understand them.

While unifying theories like string theory aim to derive these constants from deeper frameworks, those frameworks still inherently restrict ε0 and μ0 to precise values defined by the overall mathematical constraints and physical principles. So in essence, the theoretical parameter space is finite because ε0 and μ0 represent real, immutable physical constants defining the properties of the vacuum and electromagnetism itself. Allowing them to vary unconstrained from negative to positive infinity is simply not compatible with our established theories of physics and electromagnetics. If ε0 and μ0 had values even slightly different from their current experimental values, then the universe and laws of physics as we know them simply would not exist.

There is no theoretical principle that prevents ε0 and μ0 from having different values. The values they have are not derived from more fundamental reasons; they simply define the parameters of this particular instantiation of the universe. There is no inherent reason why these specific values for the vacuum permittivity and permeability must exist, rather than not exist at all. The universe could have had a completely different set of values for these constants. So in that sense the parameter space is infinite, but other values would be inconsistent or non-life-conducive. Rather, the values ε0 and μ0 have are simply the ones miraculously conducive to a universe that developed the laws of physics and reality as we've observed it.

While the reasoning we established concludes that the odds of randomly stumbling upon the precise life-permitting values for ε0 and μ0 from their infinite parameter space are infinitesimally small, they are not technically zero. In an unbounded, infinite sequence or parameter space, every possible permutation or value will be inevitably realized at some point if we consider an unending random process across infinite iterations or trials. So in that sense, even if the life-permitting values have an infinitesimal probability of being randomly selected in any one instance, they would eventually be "stumbled upon" given an infinite number of tries across eternity. This mirrors the philosophical perspective often raised regarding the "infinite monkey theorem" - that given infinite time and opportunities, even highly improbable events like monkeys randomly reproducing the complete works of Shakespeare by chance will occur.
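The point that a vanishingly small per-trial probability still yields hits given enough trials can be illustrated with a toy simulation. All numbers here are arbitrary illustrations, not physical values: we sample a dimensionless "constant" uniformly and count how often it lands in a narrow "life-permitting" window.

```python
import random

random.seed(0)  # deterministic run for reproducibility

# Arbitrary narrow target window of width 1e-6 inside [0, 1)
window_lo, window_hi = 0.5, 0.5 + 1e-6
trials = 1_000_000
hits = sum(window_lo <= random.random() < window_hi for _ in range(trials))

print(f"hits in {trials:,} trials: {hits}")  # typically 0, 1, or 2
# As trials grow without bound, the expected number of hits grows without
# bound too: improbable is not impossible given unlimited attempts.
```

The simulation is only a finite stand-in for the "infinite monkey" intuition; it does not model any actual cosmological mechanism.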

However, this gives rise to some profound implications and considerations: Is our universe one such "chance" occurrence in a wide multiverse or infinite cycle of universes? Are all possible permutations realized somewhere? If so, does this undermine or support the idea of an intelligent agency being required? Some may argue stumbling upon a life-universe through blind chance, however improbable, still negates a need for a creator. Even if inevitable in infinity, should we still regard such an exquisitely perfect life-permitting universe as a staggering statistical fluke worthy of marvel and philosophical inquiry? The fact that we are able to ponder these questions as conscious observers introduces an additional layer of improbability - not just a life-permitting universe, but one that gave rise to intelligent life capable of such reasoning.

Now let me directly challenge the viability of the "infinite multiverse generator" idea as an explanation for our life-permitting universe arising by chance. For there to be an infinite number of trials/attempts at randomly generating a life-permitting universe across the infinite parameter spaces, there would need to be some kind of eternal "multiverse generator" mechanism constantly creating new universes with different values of the fundamental constants. However, such an infinite multiverse generator would itself require an infinite past duration of time across which it has already been endlessly operating and generating universes through trial and error. The notion of actualizing or traversing an infinite set or sequence is problematic from a philosophical and mathematical perspective. There are paradoxes involved in somehow having already completed an infinite set of steps/changes to reach the present moment. Infinite regress issues also arise: what kicked off or set in motion this supposed infinite multiverse generator in an utterly beginningless past eternity? How could such a changeless, static infinite existence transition into a dynamic generative engine? Our best scientific theories point to the universe and time itself having a finite beginning at the Big Bang, not an infinite past regress of previous states or transitions. Proposing an infinite multiverse generator that has somehow already produced an eternity of failed universes prior to this one relies on flawed assumptions about completing infinite processes and the nature of time itself. If time, change, and the laws that govern our universe had a definitive beginning, as our evidence suggests, then it undermines the idea that an infinite series of random universe generations over a beginningless past could have produced our highly specified, life-permitting cosmic parameters by chance alone.
This strengthens the need to posit an intelligent agency, causal principle, or a deeper theoretical framework to account for the precise life-permitting values being deterministically established at the cosmic inception, rather than appealing to infinite odds across a beginningless past.

Major Premise: If a highly specified set of conditions/parameters is required for a particular outcome (like a life-permitting universe), and the range of possible values allowing that outcome is infinitesimally small relative to the infinite total parameter space, then that outcome could not occur by chance across a finite number of trials or temporal stages.
Minor Premise: The values of the vacuum permittivity (ε0) and vacuum permeability (μ0) form an infinitesimally small life-permitting region within their infinite theoretical parameter spaces, requiring highly specified "fine-tuning" for a life-bearing universe to emerge, as evidenced by the failure of all other value combinations to produce an inhabitable cosmos.
Conclusion: Therefore, a life-permitting universe with precisely fine-tuned values of ε0 and μ0, such as ours, could not have arisen by chance alone across any finite series of temporal stages or universe generations, as prescribed by the known laws of physics and the finite past implied by the Big Bang theory.

This leads to the additional conclusion that an intelligent agency is necessary to deterministically establish or select the precise life-permitting values of ε0 and μ0 at the cosmic inception in order for an inhabitable universe to be actualized.




Last edited by Otangelo on Wed May 29, 2024 3:05 pm; edited 15 times in total

https://reasonandscience.catsboard.com

2. Planck's constant
(posted Fri Aug 13, 2021, 6:52 am by Otangelo, Admin)

Question: What is Planck's constant?
Answer: Planck's constant, symbolized h, relates the energy in one quantum (photon) of electromagnetic radiation to the frequency of that radiation.
https://whatis.techtarget.com/definition/Plancks-constant

Patrick J. Kiger: What Is Planck's Constant, and Why Does the Universe Depend on It?  Dec 10, 2019
Planck's constant was devised in 1900 by German physicist Dr. Max Planck, who would win the 1918 Nobel Prize for his work. The constant is a crucial part of quantum mechanics, the branch of physics dealing with the tiny particles that make up matter and the forces involved in their interactions. From computer chips and solar panels to lasers, "it's the physics that explains how everything works."

The Invisible World of the Ultrasmall
Planck and other physicists in the late 1800s and early 1900s were trying to understand the difference between classical mechanics — that is, the motion of bodies in the observable world around us, described by Sir Isaac Newton in the late 1600s — and an invisible world of the ultrasmall, where energy behaves in some ways like a wave and in some ways like a particle, also known as a photon. When you get down to the level of quantum mechanics, things behave differently. "The amount of energy that an oscillator could have is discrete, like rungs on a ladder," Schlamminger says. "The energy levels are separated by h times f, where f is the frequency of the photon — a particle of light — an electron would release or absorb to go from one energy level to another." "Energy is quantized, or it's discrete, meaning I can only add one sugar cube or two or three. Only a certain amount of energy is allowed." Electromagnetic radiation and elementary particles "display intrinsically both particle and wave properties," explains Fred Cooper, an external professor at the Santa Fe Institute, an independent research center in New Mexico, by email. "The fundamental constant which connects these two aspects of these entities is Planck's constant. Electromagnetic energy cannot be transferred continuously but is transferred by discrete photons of light whose energy E is given by E = hf, where h is Planck's constant, and f is the frequency of the light."
Planck's constant defines the amount of energy that a photon can carry, according to the frequency of the wave in which it travels.
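The relation E = hf can be evaluated directly. A small sketch, using the 2018 value of h quoted below; the green-light frequency is an illustrative choice:

```python
h = 6.62607015e-34   # Planck constant, J*s (2018 CODATA / 2019 SI value)

def photon_energy(frequency_hz):
    """E = h * f: the quantum of energy carried by one photon."""
    return h * frequency_hz

# Green light has a frequency of roughly 5.45e14 Hz
E = photon_energy(5.45e14)
print(f"E = {E:.3e} J")   # ~3.61e-19 J per photon
```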

A Slightly Changing Constant
One of the confusing things for nonscientists about Planck's constant is that the value assigned to it has changed by tiny amounts over time. Back in 1985, the accepted value was h = 6.626176 x 10^-34 Joule-seconds. The current calculation, done in 2018, is h = 6.62607015 x 10^-34 Joule-seconds.

While these fundamental constants are fixed in the fabric of the universe, we humans don't know their exact values. We have to build experiments to measure these fundamental constants to the best of humankind's ability. Our knowledge comes from a few experiments that were averaged to produce a mean value for the Planck constant. To measure Planck's constant, scientists have used two different experiments, the Kibble balance and the X-ray crystal density (XRCD) method, and over time they've developed a better understanding of how to get a more precise number. When a new number is published, the experimenters put forward their best number as well as their best calculation of the uncertainty in their measurement. The true but unknown value of the constant should hopefully lie in the interval of plus/minus the uncertainty around the published number, with a certain statistical probability. At this point, we are confident that the true value is not far off. The Kibble balance and the XRCD method are so different that it would be a major coincidence for both methods to agree so well by chance.
That tiny imprecision in scientists' calculations isn't a big deal in the scheme of things. But if Planck's constant were a significantly bigger or smaller number, "all the world around us would be completely different." If the value of the constant was increased, for example, stable atoms might be many times bigger than stars. The definition of the kilogram, which came into force on May 20, 2019, as agreed upon by the International Bureau of Weights and Measures (whose French acronym is BIPM), is now based upon Planck's constant.
https://science.howstuffworks.com/dictionary/physics-terms/plancks-constant.htm

JAMES STEIN Planck's Constant: The Number That Rules Technology, Reality, and Life OCTOBER 24, 2011
Chemistry tells us that the smallest amount of water is a water molecule, and any container of water consists of a staggering number of identical water molecules. In order to resolve an underlying problem in the theory of energy distribution, Planck wondered, What if energy worked the same way? What if there were a smallest unit of energy, just as there is a smallest unit of water? The idea that energy could be expressed in discrete units, or “quantized,” was fundamental to the development of quantum theory. Indeed, you might say that Planck put the “quanta” in quantum mechanics.

So what is this smallest unit of energy? 
Planck hypothesized the existence of a constant, now known as Planck’s constant, or h, which links a wave or particle’s frequency with its total energy. Today, we know that
h = 6.6262 x 10^-34 Joule⋅second

Planck’s constant has had profound ramifications in three important areas: our technology, our understanding of reality, and our understanding of life itself. Of the universal constants—the cosmic numbers which define our Universe—the speed of light gets all the publicity (partially because of its starring role in Einstein’s iconic equation E = mc^2), but Planck’s constant is every bit as important. Planck’s constant has also enabled the construction of the transistors, integrated circuits, and chips that have revolutionized our lives.
More fundamentally, the discovery of Planck’s constant advanced the realization that, when we probe the deepest levels of the structure of matter, we are no longer looking at “things” in the conventional meaning of the word. A “thing”—like a moving car—has a definite location and velocity; a car may be 30 miles south of Los Angeles heading east at 40 miles per hour. The concepts of location, velocity, and even existence itself blur at the atomic and subatomic level. Electrons do not exist in the sense that cars do; they are, bizarrely, everywhere at once, but much more likely to be in some places than in others. Reconciling the probabilistic subatomic world with the macroscopic everyday world is one of the great unsolved problems in physics. The fundamental nuclear reaction eventually leading to the explosion of a supernova is the fusion of four hydrogen atoms to produce a single atom of helium. In the process, approximately 0.7% of the mass is converted to energy via E = mc^2.

This 0.7% is known as the efficiency of hydrogen fusion, and our understanding of it is one of the consequences of Planck’s investigations. It requires a great deal of heat to enable hydrogen to fuse to helium, and the hydrogen atoms in the sun are moving at different speeds, much like cars on a freeway move at different speeds. The slower-moving hydrogen atoms just bounce off each other; they are insufficiently hot to fuse. Higher speeds, though, mean higher temperatures, and there is a small fraction of hydrogen atoms moving at sufficiently high speeds to fuse to helium.

The 0.7% efficiency of hydrogen fusion is what is sometimes referred to as a “Goldilocks number.” Like the porridge that Goldilocks eventually ate, which was neither too hot nor too cold, but just right, the 0.7% efficiency of hydrogen fusion is “just right” to permit the emergence of life as we know it. The process of hydrogen fusion is an intricate high-speed, high-temperature ballet. The first step of this reaction produces deuterium, an isotope of hydrogen whose nucleus consists of one proton and one neutron. In this process, two protons slam into one another, causing one of the protons to shed its electrical charge and metamorphose into a neutron. If the efficiency of hydrogen fusion were as low as 0.6%, the neutron and proton would not bond to each other to form a deuterium atom. In this case, we’d still have stars—huge glowing balls of hydrogen—but no star stuff would ever form because the porridge would be too cold to create helium, the first step on the road to creating the elements necessary for life.
On the other hand, if hydrogen fusion had an efficiency of 0.8%, it would be much too easy for helium to form. The hydrogen in the stars would become helium so quickly that there wouldn’t be much hydrogen left to form the molecule most essential for life—water. Starstuff would be produced, but without water life as we know it would not exist. Maybe something else would take the place of water, and maybe life could evolve—but not ours.
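The 0.7% figure can be recovered from standard atomic masses with simple arithmetic. A quick check (mass values are standard tabulated figures, rounded):

```python
# Mass bookkeeping for hydrogen fusion: four hydrogen-1 atoms fuse into
# one helium-4 atom; the missing mass is released as energy via E = mc^2.
m_H  = 1.007825    # hydrogen-1 atomic mass, in atomic mass units (u)
m_He = 4.002602    # helium-4 atomic mass, in u

mass_in    = 4 * m_H
mass_lost  = mass_in - m_He
efficiency = mass_lost / mass_in
print(f"fraction of mass converted to energy: {efficiency:.4%}")  # ~0.71%
```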
Planck’s quantization of energy was an essential step on the road to the theory of quantum mechanics, which is critical to our understanding of stellar evolution. Science hasn’t filled in all the pieces of the puzzle of how life actually evolved, but quantum mechanics did begin to answer the question of how the pieces got there in the first place.
https://www.pbs.org/wgbh/nova/article/plancks-constant/

The Planck constant, or Planck's constant, is a fundamental physical constant denoted h, and is of fundamental importance in quantum mechanics. A photon's energy is equal to its frequency multiplied by the Planck constant. Due to mass–energy equivalence, the Planck constant also relates mass to frequency.
https://en.wikipedia.org/wiki/Planck_constant

Fine-Tuning in Quantum Mechanics 
The rules of quantum mechanics are, of course, very different from the rules of classical mechanics. In classical physics, the rules are deterministic and each object has both a place and a definite velocity. But in quantum mechanics, “it’s a bit more complicated. Particles such as electrons have wave-like properties, so we describe them with a wave function whose peaks and troughs tell us where the electron probably is, and where it is probably going”. It is very good news for our universe that classical physics does not hold at the level of atoms, because if it did then atoms would be unstable. But classical mechanics does hold for larger objects like people, plants, and planets. So where is the boundary line between the quantum and the classical? The answer lies in Planck’s constant, which has been experimentally found to have the value of 6.62606957 × 10^-34 J·s (kg m^2 s^-1) in our universe. Below this scale the rules of quantum mechanics hold, and above it the rules of classical mechanics hold (with some complications not directly relevant to this argument). What would happen if we changed Planck’s constant? If we brought the constant to 0, then classical mechanics would hold not only for medium-sized objects like us, but also for atoms too. This would be a disaster for our universe because atoms would become unstable “as electrons lose energy and spiral into the nucleus”. Such a universe could not have much interesting chemistry. But what if we made Planck’s constant considerably larger? In this imaginary universe, medium-sized material objects would behave in quantum-like ways. While there are tricky philosophical questions about how to interpret quantum mechanics, we can be sure that if ordinary medium-sized objects behaved according to these laws, the world would be a very different place. In such a world bodies would have to be “fuzzy.” “It would be like Schrodinger’s ‘quite ridiculous’ cat, never knowing where things are or where they are going. Even more confusingly, its own body could be similarly fuzzy”. While it is unclear exactly what such a world would look like, we would know that it would not be obeying the laws of classical mechanics and that objects would have to behave in both wave-like and particle-like ways. Whether it would be possible to hunt down a boar who moved according to a wave function is far from clear (at best!), not to mention that I could not have both a place and a determinate velocity at the same time. Imagine kicking a ball in a world with a very large Planck’s constant: both the world around us and the ball would be radically unpredictable. This lack of predictability would be a significant problem for the existence of life. Thus, it seems as though Planck’s constant has to be relatively close to its current value for both atoms to be stable and life to be possible.

Jason Waller: Cosmological Fine-Tuning Arguments - What (if Anything) Should We Infer from the Fine-Tuning of Our Universe for Life? 2019

Planck’s constant marks the boundary line between the quantum and the classical: below this scale the rules of quantum mechanics hold, and above it the rules of classical mechanics hold. If its value were brought down toward zero, electrons orbiting the nucleus would lose energy and spiral into the nucleus. But what if we made Planck’s constant considerably larger? If the value of the constant were increased, stable atoms might be many times bigger than stars. Doubling Planck's constant might result in a radical change in the geometric sizes and apparent colors of macroscopic objects, the solar spectrum and luminosity, the climate and gravity on Earth, as well as energy conversion between light and materials, such as the efficiency of solar cells and light-emitting diodes. We ourselves would not be stable but would behave in quantum-like ways: our bodies would oscillate and be “fuzzy.” We would not be obeying the laws of classical mechanics, and objects would have to behave in both wave-like and particle-like ways. Thus, Planck’s constant has to have the value it has for both atoms to be stable and life to be possible.
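The scale at which quantum behavior matters can be illustrated with the de Broglie wavelength, λ = h/(mv). A sketch comparing an electron to an everyday object; the masses and speeds are illustrative choices:

```python
h = 6.62607015e-34   # Planck constant, J*s

def de_broglie_wavelength(mass_kg, speed_m_s):
    """lambda = h / (m * v): the scale at which wave behavior matters."""
    return h / (mass_kg * speed_m_s)

# Electron at ~1% of light speed: wavelength comparable to atomic sizes,
# so quantum mechanics dominates.
lam_electron = de_broglie_wavelength(9.109e-31, 3.0e6)
# A 0.4 kg ball at 10 m/s: wavelength absurdly smaller than the ball,
# so classical mechanics is an excellent approximation.
lam_ball = de_broglie_wavelength(0.4, 10.0)

print(f"electron: {lam_electron:.2e} m")  # ~2.4e-10 m (atomic scale)
print(f"ball:     {lam_ball:.2e} m")      # ~1.7e-34 m
```

The enormous gap between the two wavelengths is one way of seeing why a much larger h would push quantum "fuzziness" up to everyday scales.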

Planck's constant, symbolized h, relates the energy in one quantum (photon) of electromagnetic radiation to the frequency of that radiation.
In the International System of units (SI), the constant is equal to approximately 6.626176 x 10-34 joule-seconds.
https://whatis.techtarget.com/definition/Plancks-constant

Question:    What would happen if Planck's constant changed?
Answer:   Pao-Keng Yang How does Planck's constant influence the macroscopic world?  6 July 2016
Doubling Planck's constant might result in a radical change on the geometric sizes and apparent colors of macroscopic objects, the solar spectrum and luminosity, the climate and gravity on Earth, as well as energy conversion between light and materials such as the efficiency of solar cells and light-emitting diodes.



Observation: 
Fundamental properties are the most basic properties of a world. In terms of the notion of grounding, fundamental properties are themselves ungrounded and they (at least partially) ground all of the other properties. The laws metaphysically determine what happens in the worlds that they govern. These laws have a metaphysically objective existence. Laws systematize the world. 
Question:  
What are the fundamental laws of nature/physics, and the constants, that govern the physical world? If these laws are necessary, and the constants require a specific value in a narrow range for a life-permitting universe to exist, why is the universe set to be life-permitting, rather than not?





4. Electric Charge
(posted Fri Aug 13, 2021, 7:24 am by Otangelo, Admin)


1. If you throw the electric charges and quarks together at random, you get no atoms and a dead universe.
2. So in fact, the electric charges and quarks were not thrown together at random, but selected carefully to permit stable atoms and a life-permitting universe.
3. Of course, we can appeal to physics that we don't even know and posit a multiverse, claiming that random shuffling of these fundamental constants eventually produced one universe permitting a functional outcome, but that would just be a multiverse-of-the-gaps argument.
4. The best explanation is that an intelligent designer created the right constants, fundamental forces, charges, colors etc. that produced stable atoms, and a life-permitting universe for his own purposes.  

A World of Electrons
Electrons, being so small and light, may seem remote and abstract, but the world we know is primarily the world of electrons.

The light we see is emitted by electrons. Sounds we hear are carried by electrons bouncing off each other. Tastes and smells we experience are caused by chemical reactions driven by electrons. Every time we touch something we feel the repulsion of that thing’s electrons. In both plasma globes and lightning bolts, the path of electrons is visible.
Every chemical reaction is activity between electrons. Accordingly, the properties of elements, the compounds they can form, their level of reactivity, all of it, is determined by the properties of electrons.

If the mass or charge of electrons had different values, all of chemistry would change. For example, if electrons were heavier, atoms would be smaller and bonds would require more energy. If electrons were too heavy there would be no chemical bonding at all.

If electrons were much lighter, bonds would be too weak to form stable molecules like proteins and DNA. Visible and infrared light would become ionizing radiation. They would be as harmful as X-rays and UV are to us now. Our own body heat would damage our DNA.

Luckily for us, electrons weigh just enough to yield a stable, but not sterile chemistry.

Stephen Hawking in “A Brief History of Time” (1988)
The laws of science, as we know them at present, contain many fundamental numbers, like the size of the electric charge of the electron and the ratio of the masses of the proton and the electron. […]
The remarkable fact is that the values of these numbers seem to have been very finely adjusted to make possible the development of life. For example, if the electric charge of the electron had been only slightly different, stars either would have been unable to burn hydrogen and helium, or else they would not have exploded.

A Starless Universe
Electrons are very light compared to the protons and neutrons:

The proton’s mass mp is 1,836.15 times the mass of an electron me
The neutron’s mass mn is 1,838.68 times the mass of an electron me
In a ton of coal, the electrons contribute little more than half a pound.
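The half-pound figure can be checked with simple arithmetic, treating coal as pure carbon (an approximation): 6 electrons per atom of mass ~12 u, with the electron/proton mass ratio quoted above. The 2,000 lb short ton is an assumption here:

```python
# Rough check of the "half a pound of electrons per ton of coal" claim.
m_e_u = 1 / 1836.15            # electron mass in atomic mass units (approx.)
electron_fraction = 6 * m_e_u / 12.0   # 6 electrons per ~12 u carbon atom

ton_lb = 2000.0                # US short ton in pounds (an assumption)
electron_lb = electron_fraction * ton_lb
print(f"{electron_lb:.2f} lb of electrons per ton of coal")  # ~0.54 lb
```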

We’ve seen how electron weight is of critical importance to chemistry. But so too are the masses of other particles. It was important that:

Protons and neutrons be close in weight: mp ≈ mn
Yet differ in mass by more than one electron: |mp − mn| > me
And also that neutrons be heavier than protons: mn > mp
As it happens, all three of these conditions hold true. Had any of them not been met, we would end up with a universe devoid of life.
https://alwaysasking.com/is-the-universe-fine-tuned/




Sergey Shevchenko, Institute of Physics of the National Academy of Sciences of Ukraine:
"Charge" is a postulate used to explain certain experimental observations; the characteristics of "charge" derive from the experiments that went into Maxwell's (and similar) sets of equations. The Maxwell equations are themselves ad hoc postulates derived from the experiments of Ampère, Faraday, and others, aimed at fitting the theory to experiment. This is the case throughout physics: every theory is ultimately a fitting of mathematical constructions to experimental data, and in any physical theory the metaphysical phenomena/notions/objects, not only "charge" but also, say, "particle," "mass," "energy," etc., are only ad hoc, non-explainable mathematical values.
On the SS&VT "The Information as Absolute" conception (see the link in the post), it is argued that everything is an informational pattern or a system of such patterns. "Charge" is then nothing else than an informational logical construction that exchanges information with other logical constructions in accordance with the basal set of laws/links/constants on which the whole informational system "Matter" is organized, changes, and evolves.
On this view there are no metaphysical problems left; the remaining problems are purely technical: how to correctly decode and translate into human language informational constructions that are "written" in a language unknown to humans. Physics, in principle, does not differ from the case of an Egyptologist decoding the wording on an ancient Egyptian sarcophagus.
A further inference from the conception is that there is nothing surprising when humans correctly decode information in Matter. The Matter's basal set has produced a huge diversity of material objects and systems of objects, governed by next-level sets of laws/links/constants, and these can be decoded with practically no understanding of what the utmost fundamental and universal basal set is; that is why a number of physical theories are adequate to objective reality.
https://www.researchgate.net/post/Where-do-electrons-and-protons-get-their-charge-from

Leonard Susskind The Cosmic Landscape: String Theory and the Illusion of Intelligent Design 2006
The quark masses vary over a huge range from roughly 10 electron masses for the up-and down-quarks to 344,000 electron masses for the top-quark. Physicists puzzled for some time about why the top-quark is so heavy, but recently we have come to understand that it’s not the top-quark that is abnormal: it’s the up-and-down-quarks that are absurdly light. The fact that they are roughly twenty thousand times lighter than particles like the Z-boson and the W-boson is what needs an explanation. The Standard Model has not provided one. Thus, we can ask what the world would be like if the up-and-down quarks were much heavier than they are. Once again—disaster! Protons and neutrons are made of up-and down-quarks. According to the quark theory of protons and neutrons, the nuclear force (force between nucleons) can be traced to quarks hopping back and forth between these nucleons. If the quarks were much heavier, it would be much more difficult to exchange them, and the nuclear force would practically disappear. With no sticky flypaper force holding the nucleus together, there could be no chemistry. Luck is with us again.
https://3lib.net/book/2472017/1d5be1

Anjan Sadhukhan QUANTIZED CHARGE & FRACTIONAL CHARGE: FEW FACTS, FINDINGS AND NEW IDEAS  June 3, 2020
The physical world around us is made up of different atoms and molecules, which are the building blocks of the Universe. Molecules are collections of atoms, and atoms consist of neutral neutrons, negatively charged electrons, and an equal number of positively charged protons. Atoms are thus electrically neutral overall. However, any system possessing an unequal number of electrons and protons is referred to as an ionic system.

What is charge?
The charge on the electron is a fundamental constant of nature.
https://openstax.org/books/physics/pages/18-1-electrical-charges-conservation-of-charge-and-transfer-of-charge

It is one of the most fundamental questions of nature. Like mass, charge is a physical property. If we place an object in a gravitational field, it will experience a force due to its mass. In the same way, a charged particle experiences a force in an electromagnetic field due to the presence of charge in it. It is well accepted that e is the smallest available independent charge in the physical world. However, composite particles like neutrons and protons are made up of elementary particles called 'quarks', which carry smaller charges that are multiples of e/3.
https://adamasuniversity.ac.in/quantized-charge-fractional-charge-few-facts-findings-and-new-ideas/

It's as if the electron says: hey quarks, let's team up, let's make an atom. Quarks: Hey, great idea, how much electric charge do you have? (Let's call it e.) OK, three of us will team up, and each of us will take -1/3 or +2/3 of your charge. Electron: Great! I can feel it, this way we can have a stable atom. Quarks: Great! (Or maybe it was rather their creator thinking about that, in order to make, in the end, you and me?!) It seems that the quarks teamed up exactly so that three of them cancel out (attract) exactly the electron's electric charge. Now, here is the thing: more elaborate composite nucleon models can have constituent partons with other rational multiples of charge, which would not cancel the charge of the electron; there would then be no stable atoms, only ions, no molecules, and no life in the universe! Composite particles could result in fractional charges: for example, two quarks and one anti-quark could sum to a fractional charge (1/3).

So is this state of affairs coincidence, or the result of the thoughts of our wise creator?

Q & A: Quarks and Fractional Charges  12/28/2009
Protons have charge +1, and electrons, -1, using units of e. The charge of an atom or composite particle is found by adding the charges of its protons and electrons (since neutrons are electrically neutral). Therefore, the charges of such particles are integer values. However, there are subatomic particles with fractional charges. It turns out that protons and neutrons are composed of particles called "quarks." These quarks, which come in different "flavors" (up, down, charm, strange, top, bottom) make up certain particles. They have fractional charge. Up, charm, and top all have fractional charge of +2/3, while down, strange, and bottom all have a charge of -1/3. Protons are composed of two up quarks and one down quark, so the total charge is +1. Likewise, neutrons are composed of two down quarks and one up quark, so the total charge is 0. Quarks are confined to the particles they compose. This is, appropriately, referred to as "confinement." This is why we don't observe quarks--and therefore their fractional charges--outside their composite particles (such as protons and neutrons).
https://van.physics.illinois.edu/qa/listing.php?id=15151&t=quarks-and-fractional-charges

Quarks carry a genuine fraction of the elementary charge (-1/3 or +2/3). Baryons are made up of three quarks, and those quarks can each carry -1/3 or +2/3 of the elementary charge. This way, the quarks can combine so that the baryon carries an integer multiple of the elementary charge. Thus the nucleus and the electron can form a stable atomic state in which their electric charges cancel (attract) exactly. Any other way, the atom would not be stable.
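The integer-charge bookkeeping described above can be checked with exact rational arithmetic; a minimal sketch using Python's `fractions` module:

```python
from fractions import Fraction

# Quark electric charges in units of the elementary charge e
UP = Fraction(2, 3)
DOWN = Fraction(-1, 3)

proton = UP + UP + DOWN      # uud -> +1
neutron = UP + DOWN + DOWN   # udd -> 0
electron = Fraction(-1)

print(proton)             # 1
print(neutron)            # 0
print(proton + electron)  # 0 -> a hydrogen atom is exactly neutral
```

Exact fractions (rather than floats) make the point precisely: the quark charges sum to integers with no rounding, which is why atoms can be exactly neutral.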

In reality, the charge of the electron is not an integer: it is 1.6021765 × 10^-19 coulomb, or 4.80320451 × 10^-10 electrostatic units (esu, or statcoulombs). Any amount of charge is an integer multiple of this small unit of charge, called the elementary charge, e (expressed in the coulomb, the SI unit of charge). The interaction between charged objects is a non-contact force that acts over some distance of separation.
https://www.physicsclassroom.com/class/estatics/Lesson-3/Coulomb-s-Law

In particle physics and physical cosmology, Planck units are a set of units of measurement defined exclusively in terms of four universal physical constants, in such a manner that these physical constants take on the numerical value of 1 when expressed in terms of these units.
https://en.wikipedia.org/wiki/Planck_units
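The four constants in question — c, G, ħ, and k_B — can be combined into the Planck scales directly. A quick sketch, assuming CODATA values for the constants:

```python
import math

# CODATA values (assumed here for illustration)
hbar = 1.054571817e-34  # reduced Planck constant, J*s
G = 6.67430e-11         # gravitational constant, m^3 kg^-1 s^-2
c = 2.99792458e8        # speed of light, m/s
k_B = 1.380649e-23      # Boltzmann constant, J/K

planck_length = math.sqrt(hbar * G / c**3)  # ~1.616e-35 m
planck_time = math.sqrt(hbar * G / c**5)    # ~5.39e-44 s
planck_mass = math.sqrt(hbar * c / G)       # ~2.18e-8 kg
planck_temp = planck_mass * c**2 / k_B      # ~1.42e32 K

print(planck_length, planck_time, planck_mass, planck_temp)
```

Expressed in these units, c, G, ħ, and k_B all take the numerical value 1, which is exactly the definition quoted above.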

The electromagnetic force is the Coulomb interaction. This is the familiar law that says that like charges repel each other and opposites attract. This law alone dominates the interactions between essentially all objects larger than an atomic nucleus (10^-15 meters) and smaller than a planet (10^7 meters).
https://www.ribbonfarm.com/2015/06/23/where-do-electric-forces-come-from/
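To give the Coulomb interaction a concrete scale at the atomic level, here is a sketch of the attraction between a proton and an electron separated by one Bohr radius, assuming CODATA values for the constants:

```python
# Coulomb's law: F = k * q1 * q2 / r^2
k = 8.9875517923e9         # Coulomb constant, N*m^2/C^2
e = 1.602176634e-19        # elementary charge, C
r_bohr = 5.29177210903e-11  # Bohr radius, m

force = k * e * e / r_bohr**2
print(f"{force:.3e} N")  # ~8.2e-8 N, the pull binding hydrogen's electron
```

A force of ~10⁻⁷ N sounds tiny, but acting on a particle of ~10⁻³⁰ kg it produces enormous accelerations, which is why electromagnetism dominates at this scale.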

Where do quarks get their charge?
Science has no answer to why nature uses certain fundamental rules and physical laws and not some other rules that we can conceive. That is probably a question best left to priests or philosophers. For example, the electric charge is an intrinsic property of charged particles, and it comes in two flavors: positive and negative. Even scientists don't know where it comes from; it's something that's observed and measured. It gets worse: nobody knows how, let alone why, opposite charges attract and like charges repel according to Coulomb's law.

Those are the basic rules of Nature that we discovered. We don’t know why these are the rules. And unless we find a more fundamental rule from which these rules are deduced, we will never know the answer to that why question (and even then, we’d just replace one why with another.)

Bill C. Riemers Ph.D. Experimental High Energy Physics, Purdue University July 17, 2016
The smallest observable NET charge for a charged set of quarks is 1. But the quarks that make up the group have a fractional charge. This is a far more complex result than anyone would have reasonably expected, but it is 100% consistent with experimental observations. My best answer as to why quarks have a fractional charge is that if there is a GOD, then GOD is both extremely clever and has a really twisted sense of humor (this seems like an Easter egg in the laws of physics). God also seems to appreciate the beauty of higher mathematics.
https://www.quora.com/Where-do-quarks-get-their-charge

Closer to Truth: WHY COSMIC FINE-TUNING DEMANDS EXPLANATION
The electric charge on the proton is exactly equal and opposite to that of the electron, to as many decimal places as you care to measure. This is more than slightly anomalous, in that the proton and the electron share nothing else in common. The proton is not a fundamental particle (it is composed in turn of a trilogy of quarks), but the electron is a fundamental particle. The proton's mass is 1836 times greater than that of the electron. So how come their electric charges are equal and opposite? Why is it so? There is no reason why this state of affairs could not be different! Actually, there is an infinite number of possible different arrangements that would prohibit setting up stable atoms. Why is the electric charge on the electron exactly equal and opposite to that of the positron, the positron being the electron's antimatter opposite? That equal but opposite charge is again verified to as many decimal places as one can measure. So the electric charge on the proton and the electric charge on the positron are exactly the same, yet apart from that, the two entities are as alike as chalk and cheese. Why is there this electric charge equality between different kinds of particles?
https://www.closertotruth.com/series/why-cosmic-fine-tuning-demands-explanation

Jason Waller Cosmological Fine-Tuning Arguments 2020, page 100
Fine-Tuning in Particle Physics
What would happen to our universe if we changed the masses of the up quark, down quark, and electron? What would happen to our universe if we changed the strengths of the fundamental forces? And what would happen to our universe if we eliminated one of the four fundamental forces (gravity, electromagnetism, strong force, weak force)? In each of these cases, even relatively minor changes would make the existence of intelligent organic life (along with almost everything else in our universe!) impossible. "The neutron-to-proton mass ratio is 1.00137841870, which looks . . . uninspiring. Physically this means that the proton has very nearly the same mass as the neutron, which . . . is about 0.1 percent heavier." At first, it may seem as though nothing of significance hangs on this small difference. But that is wrong. All of life depends on it. The fact that the neutron's mass is coincidentally just a little bit more than the combined mass of the proton, electron, and neutrino is what enables neutrons to decay. If the neutron were lighter, yet only by a fraction of 1 percent, it would have less mass than the proton, and the tables would be turned: isolated protons, rather than neutrons, would be unstable. Isolated neutrons will decay within about fifteen minutes because particles tend toward the lowest possible energy consistent with the conservation laws. Given this tendency of fundamental particles, the slightly higher mass of the neutron means that the proton "is the lightest particle made from three quarks." The proton is stable because there is no lighter baryon to decay into. The ability of the proton to survive for an extended period without decaying is essential for the existence of life. In the early universe, there was a "hot, dense soup of particles and radiation," and as the universe began to cool, the heavier elements decayed into protons and neutrons.
The Universe managed to lock some neutrons away inside nuclei in the first few minutes before they decayed. Isolated protons were still important chemically because they could interact with electrons. In fact, a single proton is taken as an element even in the absence of any electrons—it is called hydrogen. But now imagine an alternative physics where the neutron was less massive, then free protons would decay into neutrons and positrons, with disastrous consequences for life, because without protons there could be no atoms and no chemistry. Neutrons are electrically neutral and so will not interact with electrons. That means a universe dominated by neutrons would have almost no interesting chemistry. A decrease in the neutron’s mass by 0.8 MeV would entail an “initially all neutron universe.  It is rather easy to arrange a universe to have no chemistry at all. If we examine a range of different possible masses for the up and down quarks (and so the proton and neutron), we can conclude that almost all changes lead to universes with no chemistry. Thus, there are firm and fairly narrow limits on the relative masses of the up and down quarks if our universe is going to have any interesting chemistry.
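The decay energetics Waller describes can be made explicit with a short calculation. The sketch below uses approximate PDG masses (assumed here) and neglects the tiny neutrino mass:

```python
# Particle masses in MeV/c^2 (approximate PDG values, assumed for illustration)
M_N = 939.56542  # neutron
M_P = 938.27209  # proton
M_E = 0.51100    # electron

ratio = M_N / M_P
# Energy released in free-neutron beta decay: n -> p + e- + anti-neutrino
q_value = M_N - M_P - M_E

print(f"neutron/proton mass ratio: {ratio:.8f}")  # ~1.0013784
print(f"beta-decay Q value: {q_value:.3f} MeV")   # ~0.782 MeV
# Q > 0, so the free neutron can decay while the proton is stable.
# Shave less than 0.1% off M_N and Q goes negative: protons would
# decay instead, with the all-neutron consequences described above.
```

The margin is strikingly thin: the entire stability of hydrogen, and hence of chemistry, rides on a mass difference of about 1.3 MeV out of 939 MeV.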

https://reasonandscience.catsboard.com

Otangelo


Admin

When I first read about the laws and constants of physics, I had only a shady, unclear understanding of what they mean. One has to invest time and study to gain real understanding.

Our universe is orderly and predictable because of these rules and laws. There is no deeper reason, or physical necessity, why they should be instantiated at all, or remain in operation, or why our universe should be orderly and follow a predictable path dictated by these laws, any more than there is a reason why a computer operates based on a software program, UNLESS someone, somebody with intelligence, has a goal and wants to achieve a specific outcome.

We could not exist without a minimal set of finely adjusted and set-up laws. Certain initial conditions have to exist, and they have to come from somewhere. Of course, you can resort to the unknown, but is that not intellectually highly unsatisfying, especially when we KNOW from our own experience that we set things up for specific purposes all the time?


Otangelo


Admin

What instantiates and secures the forces that operate in the universe?

The true truth, the one that corresponds to reality, is one, since reality is only one. Which is it? This is the biggest philosophical question. On the presuppositional approach, God is not a question of probability: God is necessary by virtue of the impossibility of the contrary.

The international system of units is based on the fundamental properties of the universe, namely time, length, mass, electric current, thermodynamic temperature, amount of substance, and luminous intensity. These are the most basic properties of our world. They are themselves ungrounded in anything deeper, and they ground all other things. So you cannot dig deeper.

Now here is the thing: these properties involve fundamental constants that are like the DNA of our Universe. They cannot be calculated from any deeper principles currently known. The constants of physics are fundamental numbers that, when inserted into the laws of physics, determine the basic structure of the universe. These constants (1) have a fixed value and (2) are exactly right to allow for a universe that permits life. For life to emerge in our Universe, the fundamental constants could not have deviated by more than a fraction of a percentage point from their actual values. The BIG question is: why is this so? These constants cannot be derived from other constants and must be measured by experiment. In a nutshell: science has no answer and does not know why they have the values they have.

It is easy to imagine a universe where conditions change unpredictably from one moment to the next, or even a universe where things pop in and out of existence. Not only must there be an agency to instantiate and secure the conditions of the universe; the forces must also be secured so that there is no chaos. We know that the fundamental forces do not change throughout the universe. This keeps the coupling constants right, which holds atoms together.
This is one of the reasons, besides Aquinas's fifth way, why in my view the question of whether God exists or does not exist is not a question of probability. God is needed to instantiate and maintain the forces of the universe in a stable way.

The Fundamental Properties of Nature, the universal constants—the cosmic numbers which define our Universe where did they come from ?
https://reasonandscience.catsboard.com/t3165-the-fundamental-properties-of-nature-where-did-they-come-from

What instantiates and secures the forces that operate in the universe?
https://reasonandscience.catsboard.com/t3156-what-instantiates-and-secures-the-forces-that-operate-in-the-universe


