... observe its color; the balls are distinguishable only by their colors. After observing, you put the ball back into the bin.

What is the information entropy of one event (one ball pick) in bits? Express your answer to three decimal places.

# Five red balls, three yellow balls, and four green balls in a bin. In each event you pick one ball and?

- Posted: 3+ months ago by Haya222
- Topics: yellow, red, green, color, ball, bins, event, back, pick, observation

## Answers (1)

Generally, entropy refers to disorder or uncertainty, and the definition of entropy used in information theory is directly analogous to the definition used in statistical thermodynamics. The concept of information entropy was introduced by Claude Shannon in his 1948 paper "A Mathematical Theory of Communication".

The question is actually well posed: "in bits" just means the logarithm is taken in base 2. With 5 + 3 + 4 = 12 balls and sampling with replacement, the probabilities are p(red) = 5/12, p(yellow) = 3/12, and p(green) = 4/12. The Shannon entropy of one draw is

H = -(5/12) log2(5/12) - (3/12) log2(3/12) - (4/12) log2(4/12) ≈ 0.526 + 0.500 + 0.528 ≈ 1.555 bits.
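A quick way to check this numerically, a minimal Python sketch of the entropy formula applied to these ball counts:

```python
from math import log2

# 5 red + 3 yellow + 4 green = 12 balls; sampling with replacement
counts = [5, 3, 4]
total = sum(counts)
probs = [c / total for c in counts]

# Shannon entropy in bits: H = -sum(p * log2(p))
H = -sum(p * log2(p) for p in probs)
print(f"{H:.3f}")  # → 1.555
```

Each term p·log2(p) is negative, so the leading minus sign makes H positive; the yellow term is exactly 0.5 because log2(1/4) = -2.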