naturism


From The Collaborative International Dictionary of English v.0.48:

Naturism \Na"tur*ism\, n. (Med.)
   The belief or doctrine that attributes everything to nature
   as a sanative agent.
   [1913 Webster]