The definition of Theta that @Lajos Arpad gave is completely wrong without the additional context that we are talking about average-case complexities.
Theta indeed means that it's a tight bound, as he said. However, that does not make it an "average" bound. It means that Theta gives both an (asymptotic) upper bound and an (asymptotic) lower bound. Basically, if $f(n)$ is the maximum number of elementary operations the algorithm does for an input of size $n$, then we say:
- $f(n) = O(g(n))$ if there is some constant $C$ such that $f(n) \leq C \cdot g(n)$ for all sufficiently large $n$.
- $f(n) = \Theta(g(n))$ if both $f(n) = O(g(n))$ and $g(n) = O(f(n))$.
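As a quick sanity check (a numeric spot check, not a proof), the $O$ definition can be tested directly; the functions and the witness constant $C = 2$ below are my own illustration:

```python
# Spot-check f(n) = O(g(n)) for f(n) = n^2 + n and g(n) = n^2:
# n^2 + n <= 2 * n^2 holds whenever n >= 1, so C = 2 is a valid witness.
def f(n):
    return n**2 + n

def g(n):
    return n**2

C = 2
assert all(f(n) <= C * g(n) for n in range(1, 10_000))
```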
For instance, an algorithm that does something like:
```python
def f(n):
    for i in range(n):
        for j in range(n):
            ...  # something
    for k in range(n):
        ...  # something else
```
where `something` takes between $1$ and $3$ operations and `something else` takes between $1$ and $2$ operations will take in total at most $3n^2 + 2n$ operations, which is smaller than $5n^2$ for $n \geq 1$, i.e., $f(n) = O(n^2)$.
It is also true that $f(n) = O(n^3)$: big-$O$ is only an upper bound, so it does not have to be tight.
Since each loop body performs at least one operation, $f(n) \geq n^2 + n \geq n^2$, so we also have $n^2 = O(f(n))$, and therefore $f(n) = \Theta(n^2)$, i.e., the number of steps the algorithm described by $f$ takes to complete its task is, asymptotically, bounded above and below by constant multiples of $n^2$.
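To make the $\Theta(n^2)$ claim tangible, here is an instrumented version of the sketch above (my own illustration; the $1$-to-$3$ and $1$-to-$2$ operation costs are modeled as random draws). The ratio of counted operations to $n^2$ stays between constants:

```python
import random

def count_ops(n):
    """Count the operations of the sketch above, modeling the assumed
    per-iteration costs as random draws from their stated ranges."""
    ops = 0
    for i in range(n):
        for j in range(n):
            ops += random.randint(1, 3)  # "something": 1 to 3 operations
    for k in range(n):
        ops += random.randint(1, 2)      # "something else": 1 to 2 operations
    return ops

for n in (10, 100, 1000):
    ops = count_ops(n)
    # ops / n^2 stays between 1 and about 3.2 -- bounded on both sides
    # by constants, which is exactly what f(n) = Theta(n^2) asserts.
    print(n, ops, ops / n**2)
```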
HOWEVER, if we instead consider average-case complexities (rather than the worst-case complexities used above), then calling Theta an "average" bound does become accurate. The corresponding definitions are the same, but with $f(n)$ defined as the average number of operations the algorithm does for an input of size $n$.
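For concreteness, here is a hypothetical illustration of that distinction (linear search is my example, not something from the answers above): the worst-case $f(n)$ counts the most comparisons any input of size $n$ can force, while the average-case $f(n)$ averages over inputs.

```python
def comparisons(arr, target):
    """Number of comparisons linear search performs to locate target."""
    for i, x in enumerate(arr):
        if x == target:
            return i + 1
    return len(arr)

n = 1000
arr = list(range(n))
worst = comparisons(arr, n - 1)                      # target last: n comparisons
average = sum(comparisons(arr, t) for t in arr) / n  # uniform target: (n + 1) / 2
print(worst, average)  # 1000 vs 500.5 -- the averaged count is the f(n)
                       # that average-case O/Theta statements bound.
```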