Talk:Dimensionality reduction



K-NN

Why is there so much emphasis on using K-NN after PCA? — Preceding unsigned comment added by 86.21.183.193 (talk) 06:14, 31 March 2015 (UTC)
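For context, the pairing being questioned is the common pipeline of projecting the data onto a few principal components and then running k-nearest neighbours in the reduced space, since distance-based methods such as k-NN tend to degrade in very high-dimensional feature spaces. A minimal sketch of that pipeline, using scikit-learn purely as an illustration (the dataset and parameter values below are my own choices, not anything prescribed by the article):

    # Illustrative only: PCA to reduce the feature count, then k-NN on the
    # reduced features. Dataset and hyperparameters are arbitrary choices.
    from sklearn.datasets import load_digits
    from sklearn.decomposition import PCA
    from sklearn.model_selection import train_test_split
    from sklearn.neighbors import KNeighborsClassifier
    from sklearn.pipeline import make_pipeline

    X, y = load_digits(return_X_y=True)          # 64 features per sample
    X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

    # Keep 10 principal components, then classify with 5-nearest neighbours.
    model = make_pipeline(PCA(n_components=10), KNeighborsClassifier(n_neighbors=5))
    model.fit(X_train, y_train)
    print("test accuracy:", model.score(X_test, y_test))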

Untitled

This article was listed on Wikipedia:Votes for deletion, and the consensus was keep: see Wikipedia:Votes for deletion/Dimensionality reduction

Article is biased, update required

Please have a look at JMLR Special Issue on Variable and Feature Selection. JKW 09:18, 23 April 2006 (UTC)

I looked at your link, and it simply appears to be a list of papers. Not sure how this relates to any bias in the article. Too bad you didn't use more words back there in 2006. David Spector (talk) 11:13, 9 November 2021 (UTC)

Dimension Reduction, not Dimensionality Reduction

Why is this called Dimensionality Reduction? It should be called Dimension Reduction. Calling it "Dimensionality Reduction" is like overconjugationalizifying a word. — Preceding unsigned comment added by 2620:0:1000:1502:26BE:5FF:FE1D:BCA1 (talk) 17:34, 11 July 2013 (UTC)

Dimensional Reduction or Reducing Dimensions

Dimensional is an adjective modifying the noun "reduction". "Dimension" and "dimensionality" are both nouns. Similarly, one says "blue truck", where "blue" is the adjective modifying the noun "truck." "Blueness" is a noun-ified form of the word, which we could use as "reducing the blueness of the truck". Saying "blueness truck" sounds wrong at best. — Preceding unsigned comment added by 2601:545:C102:A56A:79B7:6583:CB24:A50F (talk) 19:24, 30 March 2018 (UTC)

Agree David Spector (talk) 11:10, 9 November 2021 (UTC)

External links modified

Hello fellow Wikipedians,

I have just modified one external link on Dimensionality reduction. Please take a moment to review my edit. If you have any questions, or need the bot to ignore the links, or the page altogether, please visit this simple FAQ for additional information. I made the following changes:

When you have finished reviewing my changes, you may follow the instructions on the template below to fix any issues with the URLs.

Template:Sourcecheck

Cheers.—InternetArchiveBot (Report bug) 20:04, 10 September 2017 (UTC)

Nearest Shrunken Centroids

Nearest shrunken centroids (Tibshirani et al. 2002; not just the plain Nearest centroid classifier) have been used successfully when dimensionality is high and the number of training examples is small.
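For anyone who wants to try the distinction, scikit-learn's NearestCentroid classifier exposes a shrink_threshold parameter that implements the shrinkage step of Tibshirani et al.; the following is only a rough sketch on synthetic data with an arbitrary threshold, not a reproduction of the 2002 results:

    # Illustrative only: nearest centroids with and without shrinkage on
    # synthetic data with many features and few training examples.
    from sklearn.datasets import make_classification
    from sklearn.model_selection import train_test_split
    from sklearn.neighbors import NearestCentroid

    X, y = make_classification(n_samples=60, n_features=500, n_informative=20,
                               random_state=0)
    X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

    plain = NearestCentroid().fit(X_train, y_train)
    # shrink_threshold shrinks each class centroid toward the overall centroid,
    # zeroing out features that carry little class information.
    shrunken = NearestCentroid(shrink_threshold=0.5).fit(X_train, y_train)
    print("plain centroids:   ", plain.score(X_test, y_test))
    print("shrunken centroids:", shrunken.score(X_test, y_test))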

The article needs to explain "training" better. To a typical intelligent person, "training" has nothing to do with data: if I said, "the data is 1, 2, and 3", how and why is this data to be trained?
The article does say, "The training of deep encoders is typically performed using a greedy layer-wise pre-training (e.g., using a stack of restricted Boltzmann machines) that is followed by a finetuning stage based on backpropagation," without defining "deep encoders", "Boltzmann machines", or "training". I don't think that WP articles are meant to be understood only by experts in their field. I've taken college physics and mathematics, and the only word familiar to me is "backpropagation", which is a term I recall from learning about neural networks. Does "training" have to do with data, or with neural networks that recognize data? David Spector (talk) 19:33, 17 December 2022 (UTC)
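On the terminology question: in this literature, "training" means fitting a model's parameters to a dataset, usually by repeatedly adjusting the parameters to reduce some error measure; the data itself is not changed, the model is. A toy sketch of what that looks like for the simplest possible autoencoder (my own illustration, not code from the article or from any reference):

    # Illustrative only: "training" a tiny linear autoencoder by gradient descent.
    # The weight matrices W_enc and W_dec are what gets trained; the data X is fixed.
    import numpy as np

    rng = np.random.default_rng(0)
    X = rng.normal(size=(200, 10))        # 200 samples, each with 10 features

    n_hidden = 3                          # target (reduced) dimensionality
    W_enc = rng.normal(scale=0.1, size=(10, n_hidden))
    W_dec = rng.normal(scale=0.1, size=(n_hidden, 10))
    lr = 0.01

    for step in range(2000):
        Z = X @ W_enc                     # encode: 10 -> 3 coordinates
        X_hat = Z @ W_dec                 # decode: 3 -> 10 coordinates
        err = X_hat - X                   # reconstruction error
        # Gradient descent on the mean squared reconstruction error.
        grad_dec = (Z.T @ err) / len(X)
        grad_enc = (X.T @ (err @ W_dec.T)) / len(X)
        W_dec -= lr * grad_dec
        W_enc -= lr * grad_enc

    print("final reconstruction MSE:", np.mean((X @ W_enc @ W_dec - X) ** 2))

The greedy layer-wise pre-training and backpropagation the article mentions are more elaborate versions of this same fitting loop, applied one layer at a time and then to the whole stack.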

Needs definition and examples of data dimension

The lead paragraph refers principally to high and low dimension data, without any definition or examples. Even though I am a retired software engineer with 40 years of experience, I can only guess at what that might mean. A clear definition and some simple examples right up front would do much to make this article more widely useful, in my opinion. David Spector (talk) 11:09, 9 November 2021 (UTC)

To put this another way, the "dimensionality" of data does not have an obvious meaning, unless it refers to the use of indices to address data stored in an array, since an array has a dimension, which is the number of its indices. But does data have an inherent dimensionality that has nothing to do with the fact that it may be stored in an array as opposed to some other method, such as a hologram? This article just confuses me, and seems to be written for someone already very familiar with its subject matter. David Spector (talk) 19:37, 17 December 2022 (UTC)
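For what it is worth, "dimensionality" here usually just means the number of attributes (features, columns) recorded per sample, regardless of how the values happen to be stored; separately, data can have a lower intrinsic dimensionality when the samples lie on or near a lower-dimensional surface. A small sketch of both notions (entirely my own illustration, not taken from the article):

    # Illustrative only: each sample below has 3 recorded features, so the data's
    # dimensionality is 3, but the points all lie on a 2-D plane inside that
    # 3-D space, so the intrinsic dimensionality is only 2.
    import numpy as np
    from sklearn.decomposition import PCA

    rng = np.random.default_rng(0)
    plane_coords = rng.normal(size=(100, 2))      # 2 underlying degrees of freedom
    basis = np.array([[1.0, 0.0, 1.0],
                      [0.0, 1.0, 1.0]])           # embed the plane in 3-D
    X = plane_coords @ basis                      # shape (100, 3): 3 features per sample

    pca = PCA().fit(X)
    print("recorded dimensionality:", X.shape[1])                 # 3
    print("variance per component: ", pca.explained_variance_ratio_)
    # The third ratio comes out essentially zero: two directions explain all the
    # variation, so reducing these data from 3 to 2 dimensions loses nothing.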

Plain language introduction needed?

At least the introduction, if not more, should probably be tailored more to those unfamiliar with the topic.

- Jim Grisham (talk) 23:55, 4 September 2022 (UTC)

Came here to say the same. "Dimensionality reduction, or dimension reduction, is the transformation of data from a high-dimensional space into a low-dimensional space" – which domain are you even talking about? What is space? Jack who built the house (talk) 08:04, 17 June 2025 (UTC)
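One possible plain-language gloss, offered only as a suggestion for a future rewrite of the lead (my phrasing, not sourced): if each record consists of n numeric attributes, it can be read as a point in an n-dimensional space, and dimensionality reduction constructs a map to far fewer coordinates while trying to preserve the structure of the data, roughly:

    % Sketch of the usual formal reading (notation is mine, not the article's).
    x \in \mathbb{R}^{n},
    \qquad f\colon \mathbb{R}^{n} \to \mathbb{R}^{m},
    \qquad m \ll n

where f might be a linear projection (as in PCA) or a nonlinear embedding, and m is chosen small enough to visualise, store, or model the data more easily.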