The sad truth is that there is no reliable way; only the context can tell you.
That said, a zero is usually shaped more like an ellipse, while 'o' should be
more or less a circle.
For words and numbers there should be no problem, as you will always
read 'o' in a word and zero in a number, no matter how badly they are
written.
If you are using codes, then your code scheme should accept either zeroes or Os,
never both; it is the design of the scheme that will avoid disaster, like
sending money to the wrong bank account.
The same problem obviously exists for 1 and 'i' (and 'l'), etc.
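One common way to apply this advice is to fold the confusable characters into a single canonical form on input, so both spellings of a code resolve to the same value (Crockford's Base32 encoding takes the same approach). Here is a minimal sketch; the alphabet and the `normalize_code` helper are hypothetical, not from any particular standard:

```python
# Fold characters that are easily confused in handwriting into one
# canonical form, so 'O' and '0' can never name two different codes.
CONFUSABLES = str.maketrans({
    "O": "0", "o": "0",   # letter O -> zero
    "I": "1", "i": "1",   # letter I -> one
    "L": "1", "l": "1",   # letter L -> one
})

def normalize_code(raw: str) -> str:
    """Return the canonical form of a user-entered code."""
    return raw.upper().translate(CONFUSABLES)

# Both ambiguous spellings resolve to the same canonical code:
print(normalize_code("AB0O1Il"))  # -> AB00111
print(normalize_code("ab0o1il"))  # -> AB00111
```

Codes are then compared and stored only in their normalized form, so however badly a code is written or read back, the ambiguity cannot route it to the wrong record.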
In 'ancient times' (with respect to computer history), zeroes were printed
with a slash inside, exactly because of this problem; in contexts where
ambiguity poses a real danger, keeping to this habit even in handwriting
might be a solution.