Clark, Saginaw Valley State University, Department of Electrical and Computer Engineering:
Relative entropy, also known as the Kullback-Leibler distance (or Kullback-Leibler divergence) and information divergence, is a measure of how much one probability distribution differs from another. Despite the name, it is not a true distance: it is not symmetric and does not satisfy the triangle inequality.
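For two discrete probability distributions P and Q over the same outcomes, the relative entropy is D(P || Q) = sum over x of P(x) * log(P(x) / Q(x)). Below is a minimal illustrative sketch in Python (the function name kl_divergence and the coin example are our own, not part of this entry); it assumes finite discrete distributions and that Q(x) > 0 wherever P(x) > 0.

import math

def kl_divergence(p, q):
    # Relative entropy D(P || Q) in bits (log base 2), for discrete
    # distributions given as equal-length sequences of probabilities.
    # Terms with P(x) = 0 contribute nothing; Q(x) is assumed positive
    # wherever P(x) > 0.
    return sum(pi * math.log2(pi / qi) for pi, qi in zip(p, q) if pi > 0)

# Example: a fair coin P versus a biased coin Q.
p = [0.5, 0.5]
q = [0.9, 0.1]
print(kl_divergence(p, q))  # ~0.737 bits
print(kl_divergence(q, p))  # ~0.531 bits

The two results differ, which illustrates the point above: D(P || Q) is generally not equal to D(Q || P), so the quantity is a divergence rather than a distance in the metric sense.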
What does KLD stand for?
KLD stands for Kullback-Leibler Distance
This definition appears somewhat frequently
See other definitions of KLD
Other Resources:
We have 7 other meanings of KLD in our Acronym Attic
Abbreviation Database Surfer:
- Kimberly-Little Chute Public Library (Little Chute, WI)
- Khancoban Lakeside Caravan Resort (Khancoban, Australia)
- Kuala Lumpur Canoe Sports Association (Malaysia)
- Kuala Lumpur City Securities
- Klub Ludzi Ciekawych Wszystkiego (Polish: Club of People Curious About Everything; radio program; Poland)
- Këshilli I Lartë I Drejtësisë (Albanian: High Council of Justice)
- Kilolitres per Day (measurement)
- Kinder, Lydenberg, Domini and Co., Inc. (Cambridge, MA)
- King Lincoln District (Columbus, OH)
- Kongres Liberalno Demokratyczny (Polish: Liberal Democratic Congress)
- Kullback-Leibler Divergence
- Karnataka Land Developers Association (India)
- Kawartha Lakes Dressage Association (horses; Canada)
- King Lane Dental Care (UK)
- Kumarangk Legal Defence Fund Inc. (Australia)
- Karnali Local Development Programme (1993-1998; Nepal)
- Kenya Livestock Development Programme (Kenya)
- Korean Linux Documentation Project (website to create and translate open source documents into Korean)
- Kroy Label Designer Software
- Karhunen-Loève Expansion
Samples in periodicals archive:
The traditional metrics we have used are the normalized Kullback-Leibler distance (NKLD) and the normalized Levenshtein distance (NLD).