It has two meanings, basically.
A doctor who examines you might ask you to stick out your tongue to check for signs of illness, many of which show up on the tongue in some way.
I don't know how it is in your society, but in Western societies, sticking out your tongue is an insult, a way of saying "I don't care WHAT you think!" or "You look like a fool now". It is used mostly by children.