PhonePe Interview Question | Curse of Dimensionality

Question

What is the curse of dimensionality? Why do we care about it?


Answer (1)

In machine learning, we tend to start by adding as many features as possible in order to capture useful signals and obtain a more accurate result. Beyond a certain point, however, the performance of the model decreases as the number of features grows. This phenomenon is often referred to as "The Curse of Dimensionality."
The curse of dimensionality occurs because sample density decreases exponentially as dimensionality increases. When we keep adding features without also increasing the number of training samples, the feature space grows in dimension and the data within it become sparser and sparser. Due to this sparsity, it becomes much easier to find a "perfect" solution for the machine learning model, which very likely leads to overfitting. The short simulation below illustrates the sparsity effect.
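As a minimal sketch (using NumPy, which the answer does not name, so treat it as an illustrative assumption): draw a fixed number of points uniformly in d dimensions and watch pairwise distances concentrate as d grows. Every point ends up roughly equally far from every other point, which is one concrete face of the sparsity described above.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 1000  # number of samples stays fixed while dimensionality grows

# As d increases, the gap between the nearest and farthest neighbour
# shrinks relative to the mean distance: distances "concentrate" and
# the fixed sample becomes effectively sparse in the larger space.
for d in [2, 10, 100, 1000]:
    X = rng.uniform(size=(n, d))
    dists = np.linalg.norm(X[1:] - X[0], axis=1)  # distances from point 0
    relative_spread = (dists.max() - dists.min()) / dists.mean()
    print(f"d={d:>4}: relative spread of distances = {relative_spread:.3f}")
```

The relative spread drops steadily with d, which is why neighbourhood-based reasoning (and density estimation in general) degrades in high dimensions.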
Overfitting happens when the model corresponds too closely to a particular set of data and does not generalize well. An overfitted model works very well on the training dataset but fails on future data, making its predictions unreliable. The sketch below shows this failure mode in the extreme case of more features than training samples.
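A hedged sketch using scikit-learn (an assumption; the answer does not specify a library): with 100 features and only 30 samples of pure noise, ordinary least squares fits the training set essentially perfectly yet is useless on new data.

```python
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(0)
n_train, n_test, n_features = 30, 1000, 100  # more features than samples

# Pure noise: the features carry no information about y at all.
X_train = rng.normal(size=(n_train, n_features))
y_train = rng.normal(size=n_train)
X_test = rng.normal(size=(n_test, n_features))
y_test = rng.normal(size=n_test)

model = LinearRegression().fit(X_train, y_train)
print("train R^2:", model.score(X_train, y_train))  # ~1.0: a "perfect" fit
print("test  R^2:", model.score(X_test, y_test))    # near or below 0: no generalization
```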
So how can we overcome the curse of dimensionality and avoid overfitting, especially when we have many features and comparatively few training samples? One popular approach is dimensionality reduction.
Dimensionality reduction is the process of reducing the dimensionality of the feature space while preserving as much meaningful information as possible, by obtaining a set of principal features. Dimensionality reduction can be further broken into feature selection and feature extraction.
Feature selection tries to select a subset of the original features for use in the machine learning model. In this way, we can remove redundant and irrelevant features without incurring much loss of information; a short sketch follows.
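One possible illustration, assuming scikit-learn and its bundled breast-cancer dataset (both are my choices, not part of the original answer): keep only the features with the strongest univariate relationship to the target.

```python
from sklearn.datasets import load_breast_cancer
from sklearn.feature_selection import SelectKBest, f_classif

X, y = load_breast_cancer(return_X_y=True)
print("original feature count:", X.shape[1])  # 30 features

# Score each feature individually against the label and keep the top 10;
# the survivors are a subset of the original columns, unchanged.
selector = SelectKBest(score_func=f_classif, k=10)
X_selected = selector.fit_transform(X, y)
print("selected feature count:", X_selected.shape[1])  # 10 features
```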
Feature extraction is also called feature projection. Whereas feature selection returns a subset of the original features, feature extraction creates new features by projecting the data from the high-dimensional space to a space of fewer dimensions. This approach can also yield informative and non-redundant features; a PCA sketch follows.
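A minimal feature-extraction sketch using PCA from scikit-learn on the same assumed dataset (PCA is one common choice of projection, not the only one): the output columns are new linear combinations of all 30 inputs, not a subset of them.

```python
from sklearn.datasets import load_breast_cancer
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

X, _ = load_breast_cancer(return_X_y=True)

# Standardize first so that no single feature dominates the projection.
X_scaled = StandardScaler().fit_transform(X)

# Project the 30-dimensional data onto the 5 directions of largest variance.
pca = PCA(n_components=5)
X_projected = pca.fit_transform(X_scaled)
print("shape after projection:", X_projected.shape)  # (569, 5)
print("variance retained:", pca.explained_variance_ratio_.sum())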
We can use feature selection and feature extraction together.
