Entropy





Imagine a 2x2 Rubik's cube, solved so that each face contains just one colour. Take it into your hands, shut your eyes, and twist the sides around randomly a few times. Now open your eyes again. The cube could now have all sorts of possible arrangements. What are the chances that it is still perfectly solved after twisting it around blindly for a couple of minutes? They're pretty low! Instead, it is quite likely that your cube isn't perfectly solved - the faces all contain a mixture of different colours. Under random action, you could say that the faces of the cube have gone from ordered and exact to a random configuration. This idea of a neat arrangement spreading out into total chaos is a good starting point for **entropy**: a measure of disorder in a thermodynamic system.

- This article is about **entropy** in physical chemistry.
- We'll start by learning the **definition of entropy** and its **units**.
- We'll then look at **entropy changes**, and you'll be able to practice calculating entropy changes of reaction.
- Finally, we'll explore the **second law of thermodynamics** and **feasible reactions**. You'll find out how entropy, enthalpy, and temperature determine the feasibility of a reaction through a value known as **Gibbs free energy**.

In the introduction to this article, we gave you one definition of entropy.

**Entropy** **(S)** is a measure of **disorder** in a **thermodynamic system**.

However, we can also describe entropy differently.

**Entropy** **(S)** is the number of possible ways that particles and their energy can be **distributed **in a system.

The two definitions seem very different. However, when you break them down, they start to make a little more sense.

Let's revisit the Rubik’s cube. It starts off ordered - each face contains just one colour. The first time you twist it, you disrupt the order. The second time you twist it, you *might* undo your first move and restore the cube to its original, perfectly solved arrangement. But it is more likely that you will rotate a different side and disrupt the order even more. Each time you randomly twist the cube, you increase the number of possible configurations that your cube could take, decrease the chance of landing upon that perfectly solved arrangement, and get more and more disordered.

Now, imagine a 3x3 Rubik's Cube. This complex cube has many more moving parts than the first, and so has more possible permutations. If you shut your eyes and twist the sides around blindly once more, the odds of chancing upon a solved cube when you open them again are even slimmer - it is extremely unlikely that your cube will have anything but a totally random, disordered configuration. **A larger cube with more individual pieces has a greater tendency to become disordered**, simply because there are so **many more ways that it can be arranged**. For example, a simple 2x2 Rubik's cube has over 3.5 million possible permutations. A standard 3x3 cube has 45 quintillion combinations - that's the number 45 followed by 18 zeros! However, a 4x4 cube trumps them all with a mind-blowing 7.4 quattuordecillion combinations^{1}. Ever heard of a number that large before? It is 74 followed by 44 zeros! But for all of those cubes, there is only one solved arrangement, and so the odds of randomly stumbling across that perfect combination decrease.

Notice something? As time goes on, the cube goes from solved to randomly arranged, **from a state of order to ****disorder**. In addition, as the **number of moving pieces increases**, the **tendency to become more disordered increases** because the cube has a **larger number of possible arrangements**.

Let’s now relate this to entropy. Imagine that each sticker represents a certain particle and amount of energy. The energy starts off neatly **arranged **and **ordered**, but quickly becomes **randomly arranged** and **disordered**. The larger cube has more stickers, and so has more particles and units of energy. As a result, there are more possible configurations of stickers and **more possible arrangements of particles and their energy**. In fact, it is a lot easier for the particles to move away from that perfectly ordered arrangement. With each move away from the starting configuration, the particles and their energy become more and more randomly dispersed, and** more and more disordered**. This fits with our two definitions of entropy:

- The larger cube has a **higher number of possible arrangements of particles and their energy** than the smaller cube, and so has a **greater entropy**.
- The larger cube tends to be **more disordered** than the smaller cube, and so has a **greater entropy**.

Now that we have a bit of an understanding of entropy, let’s look at some of its properties:

- Systems with a **higher number of particles** or **more units of energy** have a **greater entropy** because they have more **possible distributions**.
- **Gases have a greater entropy than solids** because the particles can move around much more freely and so have more possible ways of being arranged.
- **Increasing the temperature** of a system increases its entropy because you supply the particles with more energy.
- **More complex species** tend to have a **higher entropy** than simple species because they have more energy.
- **Isolated systems tend towards a greater entropy**. This is given to us by the **second law of thermodynamics**.
- **Increasing entropy increases the energetic stability of a system** because the energy is more evenly distributed.

What do you think the **units of entropy** are? We can work them out by considering what entropy depends on. We know that it is a measure of **energy**, and is affected by **temperature** and the **number of particles**. Therefore, entropy takes the units **J·K^{-1}·mol^{-1}**.

Note that unlike **enthalpy**, entropy uses **joules**, not **kilojoules**. This is because entropy changes are typically orders of magnitude smaller than enthalpy changes. Head over to **Enthalpy Changes** to find out more.

To compare entropy values, we often use entropy under **standard conditions**. These conditions are the same as the ones used for **standard enthalpies**:

- A temperature of **298 K**.
- A pressure of **100 kPa**.
- All species in their **standard states**.

Standard entropy is represented by the symbol **S°**.

Entropy cannot be measured directly. However, we can measure the **change in entropy (ΔS)**. We typically do this using standard entropy values, which have already been calculated and verified by scientists.

**Entropy change (ΔS)** measures the change in disorder caused by a reaction.

Each reaction firstly causes an **entropy change within the system **- that is, within the reacting particles themselves. For example, a solid might turn into two gases, which increases the total entropy. If the system is **completely isolated**, this is the only entropy change that takes place. However, isolated systems don't exist in nature; they are **purely hypothetical**. Instead, reactions also affect the **entropy of their surroundings**. For example, a reaction might be exothermic and release energy, which increases the entropy of the surroundings.

We’ll start by looking at the formula for the **entropy change within a system** (commonly simply known as the **entropy change of a reaction**, or just **entropy change**), before taking a deep dive into the **entropy change of the surroundings** and the **total entropy change**.

Most exam boards only expect you to be able to calculate the** entropy change of a reaction**, not the surroundings. Check *your* specification to find out what is required of you from your examiners.

The **entropy change of a reaction** (which, you’ll remember, is also called the **entropy change of the system**) measures the **difference in entropy between the products and the reactants in a reaction**. For example, imagine your reactant is the perfectly solved Rubik’s cube, and your product is a randomly arranged cube. The product has a **much higher entropy** than the reactant, and so there is a** positive entropy change**.

We work out the standard entropy change of reaction, represented by **ΔS°**_{system} or just **ΔS°**, using the following equation:

$$\Delta S^\circ = \sum S^\circ_{products}-\sum S^\circ_{reactants}$$

1) Don’t worry - you aren’t expected to remember standard entropy values! You’ll be provided with them in your exam.

2) For examples of entropy changes, including the chance to calculate them yourself, check out **Entropy Changes**.
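To see the formula in action, here's a short Python sketch applying it to the Haber process, N2(g) + 3H2(g) → 2NH3(g). The S° values are typical data-book figures used purely for illustration; treat the code as a sketch, not a calculation tool.

```python
# Sketch: standard entropy change of reaction,
# Delta S = (sum of S for products) - (sum of S for reactants).
S_STANDARD = {"N2": 191.6, "H2": 130.7, "NH3": 192.5}  # J K^-1 mol^-1, typical data-book values

def entropy_change(reactants, products):
    """Each argument maps a species to its stoichiometric coefficient."""
    s_products = sum(coeff * S_STANDARD[species] for species, coeff in products.items())
    s_reactants = sum(coeff * S_STANDARD[species] for species, coeff in reactants.items())
    return s_products - s_reactants

# N2(g) + 3H2(g) -> 2NH3(g): four moles of gas become two,
# so we expect a negative entropy change.
delta_s = entropy_change({"N2": 1, "H2": 3}, {"NH3": 2})
print(round(delta_s, 1))  # -198.7 J K^-1 mol^-1
```

The negative result matches what we'd predict by eye: the number of gaseous molecules falls, so the system becomes more ordered.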

Let’s now see how we can use what we know about entropy to predict the possible entropy change of a reaction. This is a quick way to estimate entropy changes without doing any calculations. We predict the entropy change of a reaction by looking at its equation:

A **positive entropy change of reaction** means the entropy of the system **increases** and the products have a **higher** entropy than the reactants. This could be caused by:

- A **change of state** from **solid to liquid** or **liquid to gas**.
- An **increase in the number of molecules**. In particular, we look at the **number of gaseous molecules**.
- An **endothermic reaction** that takes in heat.

A **negative entropy change of reaction** means that the entropy of the system **decreases**, and the products have a **lower** entropy than the reactants. This could be caused by:

- A **change of state** from **gas to liquid** or **liquid to solid**.
- A **decrease in the number of molecules**. Once again, we look closely at the **number of gaseous molecules**.
- An **exothermic reaction** that releases heat.

In real life, reactions don’t just result in an entropy change within the **system** - they also cause an entropy change in the **surroundings**. This is because the system isn’t isolated, and the heat energy absorbed or released during the reaction affects the surrounding environment’s entropy. For example, if a reaction is **exothermic**, it releases heat energy, which heats up the environment and causes a **positive** entropy change in the surroundings. If a reaction is **endothermic**, it absorbs heat energy, cooling the environment and causing a **negative **entropy change in the surroundings.

We calculate the standard entropy change of surroundings using the following formula:

$${\Delta S^\circ}_{surroundings}=\frac{{-\Delta H^\circ}_{reaction}}{T}$$

Note that here, T is the temperature that the reaction takes place at, in K. For standard entropy changes, this is always 298 K. However, you can also measure *non-standard* entropy changes - just make sure you use the right value for temperature!
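As a quick Python sketch of this formula (the ΔH° value of -92.2 kJ·mol^{-1} is an assumed example, not taken from this article), note the kJ-to-J conversion so the result comes out in entropy units:

```python
# Sketch: entropy change of the surroundings, Delta S(surr) = -Delta H(reaction) / T.
def entropy_change_surroundings(delta_h_kj_per_mol, temperature_k=298.0):
    """Return Delta S(surroundings) in J K^-1 mol^-1."""
    delta_h_j = delta_h_kj_per_mol * 1000  # convert kJ to J, to match entropy units
    return -delta_h_j / temperature_k

# An exothermic reaction (assumed example: Delta H = -92.2 kJ mol^-1) releases heat,
# so the entropy of the surroundings increases:
print(round(entropy_change_surroundings(-92.2), 1))  # 309.4 J K^-1 mol^-1
```

The sign flip in the formula is the key point: a negative (exothermic) ΔH° gives a positive entropy change of the surroundings.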

Lastly, let's consider one final entropy change: **total entropy change**. Overall, it tells us whether a reaction causes an **increase in entropy** or a **decrease in entropy**, taking into consideration the entropy changes of both the **system** and the **surroundings**.

Here’s the formula:

$${\Delta S^\circ}_{total}={\Delta S^\circ}_{system}+{\Delta S^\circ}_{surroundings}$$

Using the formula for the entropy change of the surroundings that we found out above:

$${\Delta S^\circ}_{total}={\Delta S^\circ}_{system}-\frac{{\Delta H^\circ}_{reaction}}{T}$$

The total entropy change is very useful because it helps us predict whether a reaction is **feasible** or not. Don’t worry if you haven’t heard of this term before - we’ll visit it next.

We learned earlier that, according to the **second law of thermodynamics**, isolated systems tend towards a **greater entropy**. We can therefore predict that reactions with a **positive entropy change** happen of their own accord; we call such reactions **feasible**.

**Feasible** (or **spontaneous**) reactions are reactions that take place **by themselves**.

But many feasible day-to-day reactions *don’t* have a positive entropy change. For example, both rusting and photosynthesis have negative entropy changes, and yet they are everyday occurrences! How can we explain this?

Well, as we explained above, it is because natural chemical systems *aren't* isolated. Instead, they interact with the world around them and so have some sort of effect on the entropy of their surroundings. For example, **exothermic reactions release heat energy**, which **increases** their surrounding environment's entropy, whilst **endothermic reactions absorb heat energy**, which **decreases** their surrounding environment's entropy. Whilst *total* entropy always increases, the entropy of the *system* doesn't necessarily increase, provided the entropy change of the *surroundings* makes up for it.

So, reactions with a positive total entropy change are **feasible**. From looking at how a reaction affects the entropy of its surroundings, we can see that feasibility depends on a few different factors:

- The **entropy change of the reaction**, **ΔS°** (also known as the **entropy change of the system**, or just **entropy change**).
- The **enthalpy change of the reaction**, **ΔH°**.
- The **temperature** at which the reaction takes place, in K.

The three variables combine to make something called the **change in** **Gibbs free energy**.

**The change in Gibbs free energy (ΔG)** is a value that tells us about the feasibility of a reaction. For a reaction to be feasible (or spontaneous), ΔG must be negative.

Here’s the formula for the change in standard Gibbs free energy:

$$\Delta G^\circ={\Delta H^\circ}-T\Delta S^{\circ}$$

Like enthalpy, it takes the units kJ·mol^{-1}.

You can also calculate Gibbs free energy changes for *non-standard* reactions. Make sure to use the right value for temperature!
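The formula above can be sketched in Python. The example values below (ΔH° = -92.2 kJ·mol^{-1}, ΔS° = -198.7 J·K^{-1}·mol^{-1}) are assumed for illustration; the unit conversion from J to kJ is the step most often forgotten in exams.

```python
# Sketch: Delta G = Delta H - T * Delta S, remembering to convert
# Delta S from J K^-1 mol^-1 into kJ K^-1 mol^-1 before subtracting.
def gibbs_free_energy_change(delta_h_kj, delta_s_j, temperature_k=298.0):
    """Return Delta G in kJ mol^-1."""
    return delta_h_kj - temperature_k * (delta_s_j / 1000)

# Assumed example values: Delta H = -92.2 kJ mol^-1, Delta S = -198.7 J K^-1 mol^-1.
# The exothermic Delta H outweighs the unfavourable entropy term:
delta_g = gibbs_free_energy_change(-92.2, -198.7)
print(round(delta_g, 1))  # -33.0 kJ mol^-1: negative, so feasible at 298 K
```

This also illustrates the point made below: a reaction with a *negative* entropy change can still be feasible when ΔH° is sufficiently exothermic.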

The change in Gibbs free energy explains why many reactions with negative entropy changes are spontaneous. **An extremely exothermic reaction with a negative entropy change can still be feasible**, provided the magnitude of ΔH° is large enough to outweigh the unfavourable TΔS° term. This is why reactions such as rusting take place despite decreasing the entropy of the system.

You can practice calculating ΔG in the article **Free Energy**. There, you’ll also see how temperature affects the feasibility of a reaction, and you’ll be able to have a go at finding the temperature at which a reaction becomes spontaneous.

Feasibility all depends on the **total entropy change**. According to the second law of thermodynamics, **isolated systems tend towards a greater entropy**, and so the total entropy change for feasible reactions is always **positive**. In contrast, the value of Gibbs free energy change for feasible reactions is always negative.

We now know how to find both total entropy change and the change in Gibbs free energy. Can we use one formula to derive the other?

$${\Delta S^\circ}_{total}={\Delta S^\circ}_{system}-\frac{{\Delta H^\circ}_{reaction}}{T}$$

Multiply by T:

$$T{\Delta S^\circ}_{total}=T{\Delta S^\circ}_{system}-{\Delta H^\circ}_{reaction}$$

Divide by -1, then rearrange:

$$-T{\Delta S^\circ}_{total}={\Delta H^\circ}_{reaction}-T{\Delta S^\circ}_{system}$$

Compare this with the formula for the change in Gibbs free energy, \(\Delta G^\circ={\Delta H^\circ}_{reaction}-T{\Delta S^\circ}_{system}\). The right-hand sides are identical, and so:

$$\Delta G^\circ=-T{\Delta S^\circ}_{total}$$

In other words, -TΔS°_{total} is simply the change in Gibbs free energy. The only thing to watch is units: entropy is measured in J·K^{-1}·mol^{-1}, whilst Gibbs free energy is measured in kJ·mol^{-1}, so remember to convert between joules and kilojoules. We've successfully linked the two equations!
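We can check this rearrangement numerically with a quick Python sketch. The ΔH° and ΔS°_{system} values are assumed purely for illustration; the point is that both routes to ΔG° give the same answer.

```python
# Numeric check that the two routes to feasibility agree:
# Delta G = Delta H - T*Delta S(system)  versus  Delta G = -T*Delta S(total).
T = 298.0                    # K
delta_h_kj = -92.2           # kJ mol^-1 (assumed example value)
delta_s_system = -198.7      # J K^-1 mol^-1 (assumed example value)

delta_s_surroundings = -(delta_h_kj * 1000) / T        # J K^-1 mol^-1
delta_s_total = delta_s_system + delta_s_surroundings  # positive, so feasible

gibbs_direct = delta_h_kj - T * (delta_s_system / 1000)  # kJ mol^-1
gibbs_from_total = -T * delta_s_total / 1000             # kJ mol^-1

print(round(gibbs_direct, 1), round(gibbs_from_total, 1))  # the two routes agree
```

Notice that a positive total entropy change corresponds to a negative ΔG°, exactly as the second law requires.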

- **Entropy (S)** has two definitions:
  - Entropy is a measure of disorder in a system.
  - It is also the number of possible ways that particles and their energy can be distributed in a system.
- The **second law of thermodynamics** tells us that **isolated systems always tend towards a greater entropy**.
- **Standard entropy values (S°)** are measured under **standard conditions** of **298 K** and **100 kPa**, with all species in their **standard states**.
- The **standard entropy change of a reaction** (also known as the **entropy change of the system**, or just **entropy change**) is given by the formula \(\Delta S^\circ = \sum S^\circ_{products}-\sum S^\circ_{reactants}\)
- **Feasible** (or **spontaneous**) reactions are reactions that take place of their own accord.
- The entropy change of a reaction isn't enough to tell us if a reaction is feasible or not. We need to consider the **total entropy change**, which takes enthalpy change and temperature into account. This is given to us by the **change in Gibbs free energy (ΔG)**.
- **Standard Gibbs free energy change (ΔG°)** has the formula \(\Delta G^\circ={\Delta H^\circ}-T\Delta S^{\circ}\)

1. 'How Many Possible Rubik's Cube Combinations Are There?'. GoCube (29/05/2020)

An example of increasing entropy is a solid dissolving in solution, or a gas diffusing around a room.

The entropy of a system *can* decrease. However, if you look at the total entropy change, which includes the entropy change of the system's surroundings, entropy always increases as a whole.

You calculate the entropy change of a reaction (also known as the entropy change of the system, ΔS°_{system}, or just entropy change, ΔS°) using the formula ΔS° = ΣS°_{products} - ΣS°_{reactants}.

You can also calculate the entropy change of the surroundings with the formula ΔS°_{surroundings} = -ΔH°/T.

Finally, you can work out the total entropy change caused by a reaction using the formula ΔS°_{total} = ΔS°_{system} + ΔS°_{surroundings}
