Data science interview questions: Linear regression

Aspiring data scientists are expected to have in-depth knowledge of machine learning, including the different types of algorithms used in data science. Linear regression is one of the most important of these algorithms.

In this blog post, I am going to list the common linear regression questions that pop up in interviews.

Common Interview FAQ: Linear Regression

  1. What are the important assumptions of Linear regression?
  2. What is heteroscedasticity?
  3. What is the difference between R-square and adjusted R-square?
  4. How to find RMSE and MSE?
  5. What are the possible ways of improving the accuracy of a linear regression model?
  6. How to interpret a Q-Q plot in a Linear regression model?
  7. What is the significance of an F-test in a linear model?
  8. What are the disadvantages of the linear model?
  9. What is linear regression?
  10. What is feature engineering? How do you apply it in the process of modeling?
  11. What is the use of regularisation? Explain L1 and L2 regularization.
  12. How to choose the value of the parameter learning rate (α)?
  13. How to choose the value of the regularisation parameter (λ)?
  14. Can we use linear regression for time series analysis?
  15. How does multicollinearity affect linear regression?
  16. What is the normal form (equation) of linear regression? When should it be preferred to the gradient descent method?
  17. You run your regression on different subsets of your data, and in each subset, the beta value for a certain variable varies wildly. What could be the issue here?
  18. Your linear regression doesn't run and reports that there are infinitely many best estimates for the regression coefficients. What could be wrong?
  19. How do you interpret the residual vs fitted value curve?
  20. What is VIF? How do you calculate it?
  21. How do you know that linear regression is suitable for any given data?
  22. How is hypothesis testing used in linear regression?
  23. Explain gradient descent with respect to linear regression.
  24. How do you interpret a linear regression model?
  25. What is robust regression?
  26. Which graphs are suggested to be observed before model fitting?
  27. What is the generalized linear model?
  28. Explain the bias-variance trade-off.
  29. How can learning curves help create a better model?
  30. Can you name a possible method of improving the accuracy of a linear regression model?
  31. What is the importance of the F-test in a linear model?
  32. What are the disadvantages of the linear regression model?
  33. What is the curse of dimensionality? Can you give an example?
  34. What is the role of linear regression in EDA (exploratory data analysis)?
  35. How do you know which regression model you should use?
  36. Why is gradient descent computationally cheaper than the normal equation?
  37. How are ridge regression and the LASSO different from ordinary least squares, and why would you want to use them?
  38. How do you evaluate the performance of your model? Discuss validation techniques (k-fold cross-validation, nested cross-validation) and metrics (MSE, coefficient of determination).
  39. Given a regression setting with a binary response variable, what probability model should be used and why?
  40. Design a regression model to test the Law of Demand.
  41. Why is normality important in linear regression?
  42. How to estimate parameters in the linear regression model?
  43. What is the mean of the residuals in linear regression? Can you justify your answer?
  44. The correlation between age and height is 1.09. How do you interpret this?
  45. How many coefficients do we need to estimate in a simple linear regression?
  46. If two variables are correlated, is it necessary that they have a linear relationship?
  47. What is the difference between collinearity and correlation?
  48. Why do the residuals from a linear regression add up to 0?
  49. Is this still true if you fit a regression without an intercept?
  50. Name a few types of regression you are familiar with. What are the differences?
  51. What are the downfalls of using too many or too few variables?
  52. How many variables should you use?
  53. What is Standard error?
About Mitra N Mishra
Mitra N Mishra works as a full-stack data scientist.
