Linear predictor function, the Glossary
In statistics and in machine learning, a linear predictor function is a linear function (linear combination) of a set of coefficients and explanatory variables (independent variables), whose value is used to predict the outcome of a dependent variable.[1]
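To make this concrete, here is a minimal sketch (illustrative code, not from the cited source): writing the function as f(\mathbf{x}) = \beta_0 + \beta_1 x_1 + \cdots + \beta_p x_p, evaluating it is just an intercept plus a dot product of coefficients with explanatory variables.

```python
import numpy as np

def linear_predictor(beta0, beta, x):
    """Evaluate f(x) = beta0 + beta . x for one observation x."""
    return beta0 + np.dot(beta, x)

# Hypothetical coefficients and one observation with two explanatory variables.
print(linear_predictor(1.0, np.array([0.5, -2.0]), np.array([3.0, 1.0])))  # 1.0 + 1.5 - 2.0 = 0.5
```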
Table of Contents
48 relations: Basis function, Binary data, Blood pressure, Blood type, Cambridge University Press, Categorical variable, Continuous or discrete variable, Dependent and independent variables, Design matrix, Dot product, Dummy variable (statistics), Factor analysis, Gaussian function, Identifiability, Invertible matrix, K-nearest neighbors algorithm, Least squares, Linear classifier, Linear combination, Linear discriminant analysis, Linear function, Linear model, Linear regression, Logistic regression, Machine learning, Machine Learning (journal), Matrix multiplication, Moore–Penrose inverse, Multicollinearity, Nearest-neighbor interpolation, Normal distribution, Perceptron, Polynomial, Polynomial regression, Principal component analysis, Radial basis function, Random variable, Rank (linear algebra), Regularization (mathematics), Row and column vectors, Scalar (mathematics), Singular value decomposition, Statistical data type, Statistics, Support vector machine, Transpose, Value (mathematics), Y-intercept.
Basis function
In mathematics, a basis function is an element of a particular basis for a function space.
See Linear predictor function and Basis function
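As a hedged illustration of how basis functions feed a linear predictor (example code with hypothetical coefficients): expanding a scalar input into a polynomial basis keeps the predictor linear in its coefficients while making it nonlinear in the input.

```python
import numpy as np

def polynomial_basis(x, degree):
    """Expand a scalar x into the polynomial basis [1, x, x^2, ..., x^degree]."""
    return np.array([x**d for d in range(degree + 1)])

# Hypothetical coefficients for 1 + 2x^2; the predictor is a linear combination of basis values.
coef = np.array([1.0, 0.0, 2.0])
print(coef @ polynomial_basis(3.0, 2))  # 1 + 0*3 + 2*9 = 19.0
```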
Binary data
Binary data is data whose unit can take on only two possible states.
See Linear predictor function and Binary data
Blood pressure
Blood pressure (BP) is the pressure of circulating blood against the walls of blood vessels.
See Linear predictor function and Blood pressure
Blood type
A blood type (also known as a blood group) is a classification of blood, based on the presence and absence of antibodies and inherited antigenic substances on the surface of red blood cells (RBCs).
See Linear predictor function and Blood type
Cambridge University Press
Cambridge University Press is the university press of the University of Cambridge.
See Linear predictor function and Cambridge University Press
Categorical variable
In statistics, a categorical variable (also called qualitative variable) is a variable that can take on one of a limited, and usually fixed, number of possible values, assigning each individual or other unit of observation to a particular group or nominal category on the basis of some qualitative property.
See Linear predictor function and Categorical variable
Continuous or discrete variable
In mathematics and statistics, a quantitative variable may be either continuous or discrete, depending on whether its values are typically obtained by measuring or by counting, respectively.
See Linear predictor function and Continuous or discrete variable
Dependent and independent variables
A variable is considered dependent if its value depends on that of an independent variable. Linear predictor function and dependent and independent variables are both regression analysis topics.
See Linear predictor function and Dependent and independent variables
Design matrix
In statistics and in particular in regression analysis, a design matrix, also known as model matrix or regressor matrix and often denoted by X, is a matrix of values of explanatory variables of a set of objects. Linear predictor function and design matrix are both regression analysis topics.
See Linear predictor function and Design matrix
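A minimal sketch of the convention (hypothetical data): each row of the design matrix holds one object's explanatory-variable values, and a leading column of ones carries the intercept, so a single matrix product evaluates the linear predictor for every object at once.

```python
import numpy as np

# Hypothetical explanatory variables for three objects.
raw = np.array([[2.0, 5.0],
                [1.0, 3.0],
                [4.0, 6.0]])

# Prepend a column of ones so that X @ beta includes the intercept term.
X = np.column_stack([np.ones(len(raw)), raw])
beta = np.array([0.5, 1.0, -1.0])   # hypothetical coefficients
print(X @ beta)                      # linear predictor values, one per object
```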
Dot product
In mathematics, the dot product or scalar product (the term "scalar product" means literally "product with a scalar as a result") is an algebraic operation that takes two equal-length sequences of numbers and returns a single number.
See Linear predictor function and Dot product
Dummy variable (statistics)
In regression analysis, a dummy variable (also known as indicator variable or just dummy) is one that takes a binary value (0 or 1) to indicate the absence or presence of some categorical effect that may be expected to shift the outcome.
See Linear predictor function and Dummy variable (statistics)
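A small sketch of dummy coding (illustrative; dummy_encode is a hypothetical helper, not a library function): each non-reference category becomes its own 0/1 column, which lets a categorical variable enter a linear predictor.

```python
import numpy as np

def dummy_encode(values, categories):
    """One-hot encode a categorical variable into 0/1 dummy columns,
    dropping the first category as the reference level."""
    return np.array([[1.0 if v == c else 0.0 for c in categories[1:]] for v in values])

# Hypothetical blood-type data; 'A' is the reference category.
print(dummy_encode(['A', 'B', 'O'], ['A', 'B', 'O']))
# [[0. 0.]
#  [1. 0.]
#  [0. 1.]]
```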
Factor analysis
Factor analysis is a statistical method used to describe variability among observed, correlated variables in terms of a potentially lower number of unobserved variables called factors.
See Linear predictor function and Factor analysis
Gaussian function
In mathematics, a Gaussian function, often simply referred to as a Gaussian, is a function of the base form f(x) = \exp(-x^2) and with parametric extension f(x) = a \exp\left(-\frac{(x - b)^2}{2c^2}\right) for arbitrary real constants a, b and non-zero c.
See Linear predictor function and Gaussian function
Identifiability
In statistics, identifiability is a property which a model must satisfy for precise inference to be possible.
See Linear predictor function and Identifiability
Invertible matrix
In linear algebra, an n-by-n square matrix \mathbf{A} is called invertible (also nonsingular, nondegenerate or, rarely, regular) if there exists an n-by-n square matrix \mathbf{B} such that \mathbf{AB} = \mathbf{BA} = \mathbf{I}_n, where \mathbf{I}_n is the n-by-n identity matrix.
See Linear predictor function and Invertible matrix
K-nearest neighbors algorithm
In statistics, the k-nearest neighbors algorithm (k-NN) is a non-parametric supervised learning method first developed by Evelyn Fix and Joseph Hodges in 1951, and later expanded by Thomas Cover.
See Linear predictor function and K-nearest neighbors algorithm
Least squares
The method of least squares is a parameter estimation method in regression analysis based on minimizing the sum of the squares of the residuals (a residual being the difference between an observed value and the fitted value provided by a model) made in the results of each individual equation.
See Linear predictor function and Least squares
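A minimal sketch of least squares via the normal equations (synthetic data, not from the source): the estimate \hat{\boldsymbol\beta} solves (\mathbf{X}^{\mathsf{T}}\mathbf{X})\boldsymbol\beta = \mathbf{X}^{\mathsf{T}}\mathbf{y}, minimizing the sum of squared residuals.

```python
import numpy as np

rng = np.random.default_rng(0)
X = np.column_stack([np.ones(50), rng.normal(size=(50, 2))])  # design matrix with intercept
beta_true = np.array([1.0, 2.0, -0.5])
y = X @ beta_true + 0.1 * rng.normal(size=50)                  # noisy synthetic responses

# Solve the normal equations (X^T X) beta = X^T y.
beta_hat = np.linalg.solve(X.T @ X, X.T @ y)
print(beta_hat)  # close to [1.0, 2.0, -0.5]
```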
Linear classifier
In the field of machine learning, the goal of statistical classification is to use an object's characteristics to identify which class (or group) it belongs to; a linear classifier makes this decision based on the value of a linear combination of those characteristics.
See Linear predictor function and Linear classifier
Linear combination
In mathematics, a linear combination is an expression constructed from a set of terms by multiplying each term by a constant and adding the results (e.g. a linear combination of x and y would be any expression of the form ax + by, where a and b are constants).
See Linear predictor function and Linear combination
Linear discriminant analysis
Linear discriminant analysis (LDA), normal discriminant analysis (NDA), or discriminant function analysis is a generalization of Fisher's linear discriminant, a method used in statistics and other fields, to find a linear combination of features that characterizes or separates two or more classes of objects or events.
See Linear predictor function and Linear discriminant analysis
Linear function
In mathematics, the term linear function refers to two distinct but related notions.
See Linear predictor function and Linear function
Linear model
In statistics, the term linear model refers to any model which assumes linearity in the system.
See Linear predictor function and Linear model
Linear regression
In statistics, linear regression is a statistical model which estimates the linear relationship between a scalar response and one or more explanatory variables (also known as dependent and independent variables).
See Linear predictor function and Linear regression
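As a hedged illustration (hypothetical data), a simple linear regression fit with NumPy's least-squares routine: the fitted line is the linear predictor with estimated intercept and slope.

```python
import numpy as np

x = np.array([0.0, 1.0, 2.0, 3.0])
y = np.array([1.1, 2.9, 5.2, 6.8])         # roughly y = 1 + 2x (made-up values)

X = np.column_stack([np.ones_like(x), x])  # intercept column plus one explanatory variable
coef, *_ = np.linalg.lstsq(X, y, rcond=None)
print(coef)                                 # approximately [1.09, 1.94]
```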
Logistic regression
In statistics, the logistic model (or logit model) is a statistical model that models the log-odds of an event as a linear combination of one or more independent variables.
See Linear predictor function and Logistic regression
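A minimal sketch (hypothetical coefficients): logistic regression passes the linear predictor through the sigmoid, turning log-odds into a probability in (0, 1).

```python
import numpy as np

def predicted_probability(beta0, beta, x):
    """Sigmoid of the linear predictor: log-odds -> probability."""
    log_odds = beta0 + np.dot(beta, x)   # the linear predictor
    return 1.0 / (1.0 + np.exp(-log_odds))

# Log-odds of 0 correspond to a probability of exactly 0.5.
print(predicted_probability(-1.0, np.array([2.0]), np.array([0.5])))  # 0.5
```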
Machine learning
Machine learning (ML) is a field of study in artificial intelligence concerned with the development and study of statistical algorithms that can learn from data and generalize to unseen data and thus perform tasks without explicit instructions.
See Linear predictor function and Machine learning
Machine Learning (journal)
Machine Learning is a peer-reviewed scientific journal, published since 1986. Linear predictor function and Machine Learning (journal) are both machine learning topics.
See Linear predictor function and Machine Learning (journal)
Matrix multiplication
In mathematics, particularly in linear algebra, matrix multiplication is a binary operation that produces a matrix from two matrices.
See Linear predictor function and Matrix multiplication
Moore–Penrose inverse
In mathematics, and in particular linear algebra, the Moore–Penrose inverse of a matrix, often called the pseudoinverse, is the most widely known generalization of the inverse matrix.
See Linear predictor function and Moore–Penrose inverse
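A short illustration (hypothetical data): NumPy's pinv computes the Moore–Penrose pseudoinverse, which yields the least-squares solution of an overdetermined linear system.

```python
import numpy as np

X = np.array([[1.0, 2.0],
              [3.0, 4.0],
              [5.0, 6.0]])       # tall matrix with full column rank
y = np.array([1.0, 2.0, 3.0])

beta = np.linalg.pinv(X) @ y     # least-squares solution of X beta ~ y
print(np.allclose(X @ beta, y))  # True here, since y lies in the column space of X
```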
Multicollinearity
In statistics, multicollinearity or collinearity is a situation where the predictors in a regression model are linearly dependent. Linear predictor function and multicollinearity are both regression analysis topics.
See Linear predictor function and Multicollinearity
Nearest-neighbor interpolation
Nearest-neighbor interpolation (also known as proximal interpolation or, in some contexts, point sampling) is a simple method of multivariate interpolation in one or more dimensions.
See Linear predictor function and Nearest-neighbor interpolation
Normal distribution
In probability theory and statistics, a normal distribution or Gaussian distribution is a type of continuous probability distribution for a real-valued random variable.
See Linear predictor function and Normal distribution
Perceptron
In machine learning, the perceptron (or McCulloch–Pitts neuron) is an algorithm for supervised learning of binary classifiers.
See Linear predictor function and Perceptron
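A minimal sketch of the classic perceptron rule on a toy linearly separable problem (hypothetical data; labels are +1/-1): whenever the sign of the linear predictor is wrong, the weights are nudged toward the misclassified point.

```python
import numpy as np

def perceptron_train(X, y, epochs=10):
    """Perceptron learning rule: update weights on each misclassified point."""
    w, b = np.zeros(X.shape[1]), 0.0
    for _ in range(epochs):
        for xi, yi in zip(X, y):
            if yi * (np.dot(w, xi) + b) <= 0:  # wrong side of the hyperplane
                w += yi * xi
                b += yi
    return w, b

X = np.array([[2.0, 1.0], [1.0, 3.0], [-1.0, -1.0], [-2.0, -3.0]])
y = np.array([1, 1, -1, -1])
w, b = perceptron_train(X, y)
print(np.sign(X @ w + b))  # [ 1.  1. -1. -1.]
```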
Polynomial
In mathematics, a polynomial is a mathematical expression consisting of indeterminates (also called variables) and coefficients, that involves only the operations of addition, subtraction, multiplication and exponentiation to nonnegative integer powers, and has a finite number of terms.
See Linear predictor function and Polynomial
Polynomial regression
In statistics, polynomial regression is a form of regression analysis in which the relationship between the independent variable x and the dependent variable y is modeled as an nth-degree polynomial in x. Polynomial regression fits a nonlinear relationship between the value of x and the corresponding conditional mean of y, denoted E(y | x). Linear predictor function and polynomial regression are both regression analysis topics.
See Linear predictor function and Polynomial regression
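A brief illustration (noise-free synthetic data): a degree-2 polynomial regression is nonlinear in x but still linear in its coefficients, so ordinary least squares recovers them exactly here.

```python
import numpy as np

x = np.array([-2.0, -1.0, 0.0, 1.0, 2.0])
y = 3.0 - 1.0 * x + 2.0 * x**2   # exact quadratic, no noise

# np.polyfit performs least-squares polynomial regression of the given degree.
coef = np.polyfit(x, y, deg=2)
print(coef)                       # [ 2. -1.  3.] (highest power first)
```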
Principal component analysis
Principal component analysis (PCA) is a linear dimensionality reduction technique with applications in exploratory data analysis, visualization and data preprocessing.
See Linear predictor function and Principal component analysis
Radial basis function
In mathematics, a radial basis function (RBF) is a real-valued function \varphi whose value depends only on the distance between the input and some fixed point: either the origin, so that \varphi(\mathbf{x}) = \hat{\varphi}(\lVert \mathbf{x} \rVert), or some other fixed point \mathbf{c}, called a center, so that \varphi(\mathbf{x}) = \hat{\varphi}(\lVert \mathbf{x} - \mathbf{c} \rVert).
See Linear predictor function and Radial basis function
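A small sketch (hypothetical center and width): a Gaussian RBF depends on its input only through the distance to the center, so inputs equidistant from the center get identical values.

```python
import numpy as np

def gaussian_rbf(x, center, sigma=1.0):
    """Gaussian radial basis function: value depends only on ||x - center||."""
    return np.exp(-np.sum((x - center) ** 2) / (2.0 * sigma**2))

c = np.array([0.0, 0.0])
print(gaussian_rbf(np.array([1.0, 0.0]), c))  # exp(-0.5)
print(gaussian_rbf(np.array([0.0, 1.0]), c))  # exp(-0.5), identical by symmetry
```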
Random variable
A random variable (also called random quantity, aleatory variable, or stochastic variable) is a mathematical formalization of a quantity or object which depends on random events.
See Linear predictor function and Random variable
Rank (linear algebra)
In linear algebra, the rank of a matrix is the dimension of the vector space generated (or spanned) by its columns.
See Linear predictor function and Rank (linear algebra)
Regularization (mathematics)
In mathematics, statistics, finance, and computer science, particularly in machine learning and inverse problems, regularization is a process that converts the answer to a problem to a simpler one.
See Linear predictor function and Regularization (mathematics)
Row and column vectors
In linear algebra, a column vector with m elements is an m \times 1 matrix consisting of a single column of m entries, for example \boldsymbol{x} = (x_1, x_2, \dots, x_m)^{\mathsf{T}}.
See Linear predictor function and Row and column vectors
Scalar (mathematics)
A scalar is an element of a field which is used to define a vector space.
See Linear predictor function and Scalar (mathematics)
Singular value decomposition
In linear algebra, the singular value decomposition (SVD) is a factorization of a real or complex matrix into a rotation, followed by a rescaling followed by another rotation.
See Linear predictor function and Singular value decomposition
Statistical data type
In statistics, groups of individual data points may be classified as belonging to any of various statistical data types, e.g. categorical ("red", "blue", "green"), real number, odd number (1,3,5) etc.
See Linear predictor function and Statistical data type
Statistics
Statistics (from German: Statistik, "description of a state, a country") is the discipline that concerns the collection, organization, analysis, interpretation, and presentation of data.
See Linear predictor function and Statistics
Support vector machine
In machine learning, support vector machines (SVMs, also support vector networks) are supervised max-margin models with associated learning algorithms that analyze data for classification and regression analysis.
See Linear predictor function and Support vector machine
Transpose
In linear algebra, the transpose of a matrix is an operator which flips a matrix over its diagonal; that is, it switches the row and column indices of the matrix A by producing another matrix, often denoted by A^T (among other notations).
See Linear predictor function and Transpose
Value (mathematics)
In mathematics, value may refer to several strongly related notions.
See Linear predictor function and Value (mathematics)
Y-intercept
In analytic geometry, using the common convention that the horizontal axis represents a variable x and the vertical axis represents a variable y, a y-intercept or vertical intercept is a point where the graph of a function or relation intersects the y-axis of the coordinate system.
See Linear predictor function and Y-intercept