What is the entropy function and how is it used?


Grig

Full Member level 4
Hi, all,

Is there a numerical example of how the entropy function can be used?

And why is dQ/T an exact differential?

Thanks
 

mirvidon

Member level 1
Re: Entropy -What is it?

Entropy is a measure of the disorder or randomness in a closed system, or a measure of the loss of information in a transmitted message.

Take a look at **broken link removed**; there is a good image example there.
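
For reference, these are the two standard formulas behind those two senses of the word (textbook definitions, nothing specific to this thread):

```latex
% Thermodynamic (Boltzmann) entropy: k_B is Boltzmann's constant and
% \Omega is the number of microstates consistent with the macrostate.
S = k_B \ln \Omega

% Information (Shannon) entropy of a source with symbol
% probabilities p_i, in bits per symbol:
H = -\sum_i p_i \log_2 p_i
```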
 

    Grig

    Points: 2
    Helpful Answer Positive Rating

usernam

Full Member level 5
Entropy -What is it?

Quoted from the US DOE Fundamentals Handbook

Entropy (S) is a property of a substance, as are pressure, temperature, volume, and enthalpy.
Because entropy is a property, changes in it can be determined by knowing the initial and final
conditions of a substance. Entropy quantifies the energy of a substance that is no longer
available to perform useful work. Because entropy tells so much about the usefulness of an
amount of heat transferred in performing work, the steam tables include values of specific
entropy (s = S/m) as part of the information tabulated. Entropy is sometimes referred to as a
measure of the inability to do work for a given heat transferred.
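
This "property" point is exactly why dQ/T is an exact differential, as Grig asked. A minimal sketch of the standard argument, via the Clausius equality (textbook material, not part of the handbook quote above):

```latex
% Clausius equality: around any reversible cycle,
\oint \frac{\delta Q_{\mathrm{rev}}}{T} = 0 ,
% so the integral of \delta Q_{rev}/T between two states is
% path-independent. That is precisely the defining property of an
% exact differential, so it defines a state function S:
dS = \frac{\delta Q_{\mathrm{rev}}}{T} .
% Heat \delta Q by itself is path-dependent (inexact); 1/T is the
% integrating factor that makes it exact.
```

And a worked number, since a numerical example was requested: heating 1 kg of water (c ≈ 4186 J/(kg·K)) reversibly from 300 K to 350 K gives ΔS = mc·ln(350/300) ≈ 645 J/K.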
 


Grig

Full Member level 4
Re: Entropy -What is it?

These are OK.

But, for example, I know how to use the concept of energy to solve problems.

Is there some example where entropy can be used to predict something numerically?

Regards
 

Tinamuline

Member level 2
Re: Entropy -What is it?

When we talk about entropy, we usually talk about the change of entropy. Entropy differs from energy in that it does not obey a conservation law. The energy of a closed system is conserved; it always remains constant. However, for irreversible processes, the entropy of a closed system always increases. Because of this property, the change in entropy is sometimes called 'the arrow of time'. For example, we associate an egg breaking irreversibly as it drops into a cup with the forward direction of time and with the increase in entropy. The backward direction of time (a videotape run backward) would correspond to the broken egg re-forming into a whole egg and rising into the air. This backward process, which would result in an entropy decrease, never happens.
 


Grig

Full Member level 4
Re: Entropy -What is it?

OK, thanks.

And... how can entropy be used for electronic circuit analysis?
Could you give me an example, please?

Regards
 

treehugger

Member level 2
Re: Entropy -What is it?

Grig said:
OK, thanks.

And... how can entropy be used for electronic circuit analysis?
Could you give me an example, please?

Regards

Well, think of two identical transistors biased with the same current source, I. You would expect the current flowing through each transistor to be I/2. This is almost obvious, since I said the transistors are identical, but why is the combination I/3 and 2I/3 not realistic? I think the system prefers the combination I/2, I/2 just because entropy should increase. When you flip 100 coins, you expect 50 tails and 50 heads; that is also the combination which maximises the entropy. When you have a current I, you expect it to divide into I/2 and I/2: the combination maximising the entropy. This is the same as saying that when you have two rooms at temperatures 3T and T, they will reach equilibrium at 2T.

does that answer your question?
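
A minimal numerical sketch of both claims in that post, in Python (all values are arbitrary, chosen only for illustration):

```python
import math

def binary_entropy(p):
    """Shannon entropy H(p) = -p*log2(p) - (1-p)*log2(1-p), in bits."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

# The 50/50 split (heads/tails, or I/2 and I/2) maximises the entropy:
best = max((k / 100 for k in range(101)), key=binary_entropy)
print(best, binary_entropy(best))  # -> 0.5 1.0

# Two rooms with equal heat capacity C at temperatures 3T and T
# equilibrate at 2T; with dS = C*ln(Tf/Ti) per room, the total
# entropy change is C*ln(4/3) > 0, so equilibrium is favoured.
C, T = 1.0, 300.0  # arbitrary units
Tf = (3 * T + T) / 2
dS = C * math.log(Tf / (3 * T)) + C * math.log(Tf / T)
print(dS)  # -> ~0.288, i.e. C*ln(4/3)
```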
 


Nimer

Full Member level 3
Re: Entropy -What is it?

Hi,

As far as I know, entropy is a measure of chaos or uncertainty in a system, and it is always positive and increasing in a real system such as the universe.

Entropy is used to estimate the amount of information or knowledge in a system; see Abramson's book "Information Theory and Coding".

Best regards
 

saabiaan

Newbie level 3
Entropy -What is it?

The measure of the disorder of a system is called entropy. The entropy of the universe is always increasing, because of the disorder introduced by everyday processes through various means.
 

jallem

Advanced Member level 1
Entropy -What is it?

Wait a minute... Grig is asking about thermodynamic entropy! Some of these answers are about information entropy.
 

electronics_kumar

Advanced Member level 2
Entropy -What is it?

It is a measure of the amount of uncertainty present in a piece of information. For a binary source its maximum value is one bit, reached when both symbols are equally likely.
 

nonlinear

Junior Member level 3
Entropy -What is it?

Entropy is a measure of the disorder in a system: the greater the entropy, the more randomness or disorder in the system. In thermodynamics, when a system is given heat energy, the motion of its constituent molecules increases and we say that its entropy has increased.
 

SSP

Junior Member level 3
Re: Entropy -What is it?

In the digital sense, entropy is the average information per symbol.
If a source emits n different symbols, the entropy is the probability-weighted average of each symbol's information content, -log2(p).
The information content is inversely related to the probability of occurrence of the symbol:
the higher the probability, the less the information, and vice versa.
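
A short sketch of that calculation in Python (the four symbol probabilities are made up purely for illustration):

```python
import math

# Hypothetical source emitting 4 symbols with these probabilities:
probs = [0.5, 0.25, 0.125, 0.125]

# Self-information of each symbol: rarer symbols carry more bits.
for p in probs:
    print(f"p = {p:<6}  information = {-math.log2(p):.0f} bits")

# Entropy = probability-weighted average of the self-information.
H = -sum(p * math.log2(p) for p in probs)
print(f"H = {H} bits/symbol")  # -> 1.75 bits/symbol
```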
 

Mr.Cool

Advanced Member level 2
Re: Entropy -What is it?

Keep in mind that entropy behaves differently on the quantum scale (as does most everything else).

Comments welcome!

Mr.Cool
 

hn1

Advanced Member level 4
Entropy -What is it?

Entropy has different definitions in different fields, from chemistry to communications to physics, but the common thread is the randomness of the system.
 

Mr.Cool

Advanced Member level 2
Entropy -What is it?

Ahhh... but the 2nd law of thermodynamics makes it very clear what the definition is.

But it applies only statistically!

So it is more of a "probability" that things will work this way, and that probability depends on the relative size of the system in question.

If you deal in quantum states, then treating the 2nd law of thermodynamics as always correct is wrong: on that scale, individual fluctuations can run against it.

So... not quite a "law" any more, is it? It is more a "2nd description of thermodynamics that applies statistically to things on a big scale only".

By the way, "Ohm's Law" also falls into this category. It is more of a description that applies when you study circuits that are specifically designed to abide by Ohm's law! Ha... no wonder it works so well ;P

Mr.Cool
 

eng_mahmoud87

Newbie level 6
Re: Entropy -What is it?

To learn more about entropy in detail, you should read up on thermodynamics.
 

eogotenks

Member level 2
Re: Entropy -What is it?

Entropy (from the Greek εν (en = inside) and the verb τρέπω (trepo = to turn)) in thermodynamics is central to the second law of thermodynamics, which deals with physical processes and whether they occur spontaneously. Spontaneous changes occur with an increase in entropy. Spontaneous changes tend to smooth out differences in temperature, pressure, density, and chemical potential that may exist in a system, and entropy is thus a measure of how far this smoothing-out process has progressed. In contrast, the first law of thermodynamics deals with the concept of energy, which is conserved. Entropy change has often been defined as a change to a more disordered state at a molecular level. In recent years, entropy has been interpreted in terms of the "dispersal" of energy. Entropy is an extensive state function that accounts for the effects of irreversibility in thermodynamic systems.
 
