Welcome to EDAboard.com

What is the entropy function and how to use it?

Status
Not open for further replies.

Grig
Hi, all,

Is there some numerical example of how the entropy function can be used?

Why is dQ/T an exact differential?

Thanks
 

Re: Entropy -What is it?

Entropy is a measure of the disorder or randomness in a closed system or a measure of the loss of information in a transmitted message.

Look at **broken link removed**; there is a good image example there.
 

Entropy -What is it?

Quoted from the US DOE Fundamentals Handbook

Entropy (S) is a property of a substance, as are pressure, temperature, volume, and enthalpy.
Because entropy is a property, changes in it can be determined by knowing the initial and final
conditions of a substance. Entropy quantifies the energy of a substance that is no longer
available to perform useful work. Because entropy tells so much about the usefulness of an
amount of heat transferred in performing work, the steam tables include values of specific
entropy (s = S/m) as part of the information tabulated. Entropy is sometimes referred to as a
measure of the inability to do work for a given heat transferred.
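As a minimal numeric sketch of an entropy change (assumed values, not taken from the handbook or the steam tables): for an incompressible liquid heated at constant pressure, dS = m·c·dT/T, so ΔS = m·c·ln(T2/T1).

```python
import math

# Hypothetical example: entropy change when 1 kg of liquid water is
# heated from 300 K to 350 K at constant pressure.
# For an incompressible liquid: delta_S = m * c * ln(T2 / T1)
m = 1.0                 # mass, kg (assumed)
c = 4186.0              # specific heat of water, J/(kg*K)
T1, T2 = 300.0, 350.0   # initial and final temperatures, K (assumed)

delta_S = m * c * math.log(T2 / T1)
print(f"delta S = {delta_S:.1f} J/K")  # positive: entropy increases on heating
```

This is the kind of number the tabulated specific entropy s = S/m lets you look up directly instead of computing.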
 

Re: Entropy -What is it?

These are OK.

But, for example, I know how to use the term energy to solve some problems.

Is there some example where entropy can be used to predict something numerically?

Regards
 

Re: Entropy -What is it?

When we talk about entropy, we usually talk about the change of entropy. Entropy differs from energy in that it does not obey a conservation law. The energy of a closed system is conserved; it always remains constant. However, for irreversible processes, the entropy of a closed system always increases. Because of this property, the change in entropy is sometimes called 'the arrow of time'. For example, we associate the egg of our opening photograph, breaking irreversibly as it drops into a cup, with the forward direction of time and with the increase in entropy. The backward direction of time (a videotape run backward) would correspond to the broken egg re-forming into a whole egg and rising into the air. This backward process, which would result in an entropy decrease, never happens.
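A small numeric sketch of "entropy always increases" (all values assumed for illustration): put two identical blocks at different temperatures in contact. Each block's entropy change is C·ln(T_final/T_initial), and the total is always positive, which is why the process only runs one way.

```python
import math

# Two identical blocks (same heat capacity C) at 400 K and 200 K reach
# equilibrium at the average temperature, 300 K.
C = 100.0                     # heat capacity of each block, J/K (assumed)
T_hot, T_cold = 400.0, 200.0  # initial temperatures, K (assumed)
T_eq = (T_hot + T_cold) / 2   # 300 K

dS_hot = C * math.log(T_eq / T_hot)    # negative: the hot block cools
dS_cold = C * math.log(T_eq / T_cold)  # positive: the cold block warms
dS_total = dS_hot + dS_cold            # net change of the closed system
print(f"dS_total = {dS_total:.2f} J/K")  # > 0: the forward direction of time
```

The reverse process (heat flowing from cold to hot) would make dS_total negative, and it never happens.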
 

Re: Entropy -What is it?

OK, thanks.

And... how can entropy be used for electronic circuit analysis?
Please give me some example.

Regards
 

Re: Entropy -What is it?

Grig said:
OK, thanks.

And... how can entropy be used for electronic circuit analysis?
Please give me some example.

Regards

Well, think of two identical transistors biased with the same current source, I. You would expect the current flowing through each transistor to be I/2. This is almost obvious since I said the transistors are identical, but why is the combination I/3 and 2I/3 not realistic? I think the system prefers the combination I/2, I/2 just because entropy should increase. When you flip 100 coins, you expect 50 tails and 50 heads; that is also the combination which maximizes the entropy. When you have a current I, you expect it to divide into I/2 and I/2, which is the combination maximizing the entropy. This is the same as saying that when you have two rooms with temperatures 3T and T, they will reach equilibrium at 2T.

does that answer your question?
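The coin-flip argument above can be made concrete by counting microstates (a sketch, using 100 coins as in the post): the number of ways to get k heads out of 100 is C(100, k), which peaks sharply at k = 50. The 50/50 split is not mandated, just overwhelmingly more probable.

```python
from math import comb

# Number of microstates (arrangements) giving k heads out of 100 flips.
ways_50 = comb(100, 50)   # the even split
ways_33 = comb(100, 33)   # roughly the "I/3, 2I/3" split

print(f"50/50 split has {ways_50 / ways_33:.0f}x more microstates than 33/67")
# The maximum over all k is at k = 50 -- the entropy-maximizing combination.
```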
 

Re: Entropy -What is it?

Hi,

As far as I know, entropy is a measure of chaos or uncertainty in a system, and it is always positive and increasing in a real system such as the universe.

Entropy is used to estimate the amount of information or knowledge in a system; see Abramson's book "Information Theory and Coding".

Best regards
 

Entropy -What is it?

A measure of the disorder of a system is called entropy. The entropy of the universe is always increasing, because of the disorder introduced in daily life through various means.
 

Entropy -What is it?

Wait a minute... Grig is asking about thermodynamic entropy! Some of the answers are about information entropy.
 

Entropy -What is it?

It is a measure of the amount of uncertainty present in a piece of information. For a binary (two-symbol) source, its maximum value is one bit per symbol.
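A sketch of the "maximum value is one" claim for a two-symbol source: the binary entropy function H(p) = -p·log2(p) - (1-p)·log2(1-p) peaks at exactly 1 bit when p = 1/2 and is lower for any skewed distribution.

```python
import math

def binary_entropy(p):
    """Entropy in bits of a two-symbol source with probabilities p and 1-p."""
    if p in (0.0, 1.0):
        return 0.0  # no uncertainty at all
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

print(binary_entropy(0.5))  # 1.0 bit: maximum uncertainty
print(binary_entropy(0.9))  # ~0.469 bits: a biased source is more predictable
```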
 

Entropy -What is it?

Entropy is a measure of disorder in a system. The greater the entropy, the more the randomness or disorder in the system. In thermodynamics, when a system is given heat energy, the motion of its constituent molecules increases and we say that its entropy has increased.
 

Re: Entropy -What is it?

In the digital sense, entropy is the average information per symbol.
If there is a source emitting n different symbols, the entropy is the average of the information of the symbols, weighted by their probabilities of occurrence.
The information of a symbol is inversely related to its probability of occurrence:
the higher the probability, the less the information, and vice versa.
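The "average information" idea above, as a short sketch (the three-symbol alphabet and its probabilities are assumed for illustration): the self-information of a symbol with probability p is -log2(p) bits, and the source entropy is the probability-weighted average over the alphabet.

```python
import math

# Assumed example alphabet with its symbol probabilities.
probs = {"a": 0.5, "b": 0.25, "c": 0.25}

# Self-information: rarer symbols carry more information.
for sym, p in probs.items():
    print(sym, -math.log2(p), "bits")  # a: 1 bit, b: 2 bits, c: 2 bits

# Entropy = average information per symbol.
H = sum(-p * math.log2(p) for p in probs.values())
print(f"H = {H} bits/symbol")  # 1.5 bits/symbol
```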
 

Re: Entropy -What is it?

Keep in mind that entropy acts differently on the quantum scale (as does almost everything else).

comments welcome!

Mr.Cool
 

Entropy -What is it?

Entropy has different definitions in different fields, from chemistry to communication to physics, but the main point in all of them is the randomness of the system.
 

Entropy -What is it?

Ahhh... but the 2nd law of thermodynamics makes it very clear what the definition is.

But it applies only statistically!

So it is more of a "probability" that things will work this way; the probability depends on the relative size of the system in question.

If you deal with quantum-scale states, then the 2nd law of thermodynamics can be violated with non-negligible probability.

So... not quite a "law" any more, is it? It is more a "2nd description of thermodynamics that applies statistically to things on a big scale only".

By the way, "Ohm's Law" also falls into this category. It is more of a description that applies when you study circuits that are specifically designed to abide by Ohm's law! Ha... no wonder it works so well ;P

Mr.Cool
 

Re: Entropy -What is it?

To know more about entropy in detail, you should read up on thermodynamics.
 

Re: Entropy -What is it?

Entropy (Greek: εν (en = inside) + the verb τρέπω (trepo = to chase, escape, rotate, turn)) in thermodynamics is central to the second law of thermodynamics, which deals with physical processes and whether they occur spontaneously. Spontaneous changes occur with an increase in entropy. Spontaneous changes tend to smooth out differences in temperature, pressure, density, and chemical potential that may exist in a system, and entropy is thus a measure of how far this smoothing-out process has progressed. In contrast, the first law of thermodynamics deals with the concept of energy, which is conserved. Entropy change has often been defined as a change to a more disordered state at a molecular level. In recent years, entropy has been interpreted in terms of the "dispersal" of energy. Entropy is an extensive state function that accounts for the effects of irreversibility in thermodynamic systems.
 
