Coursera Answers

Machine Learning With Python Week 3 Quiz Answer | Classification

Classification


Question 1)

Which one IS NOT a sample of a classification problem?


  • To predict the category to which a customer belongs.
  • To predict whether a customer switches to another provider/brand.
  • To predict the amount of money a customer will spend in one year.
  • To predict whether a customer responds to a particular advertising campaign or not.



Question 2)

Which of the following statements are TRUE about Logistic Regression? (select all that apply)


  • Logistic regression can be used both for binary classification and multi-class classification
  • Logistic regression is analogous to linear regression but takes a categorical/discrete target field instead of a numeric one.
  • In logistic regression, the dependent variable is binary.
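To make the statements above concrete, here is a minimal logistic-regression sketch in plain NumPy: a sigmoid over a linear score, fitted by gradient descent on the log-loss. This is an illustrative toy with made-up data, not the course's implementation.

```python
import numpy as np

def fit_logistic(X, y, lr=0.1, steps=2000):
    """Toy logistic regression via gradient descent.

    X: (n, d) feature matrix; y: (n,) array of 0/1 labels (binary target).
    Returns weights w and bias b so that P(y=1|x) = sigmoid(x @ w + b).
    """
    w = np.zeros(X.shape[1])
    b = 0.0
    for _ in range(steps):
        p = 1.0 / (1.0 + np.exp(-(X @ w + b)))  # predicted probabilities
        grad_w = X.T @ (p - y) / len(y)         # gradient of the log-loss
        grad_b = np.mean(p - y)
        w -= lr * grad_w
        b -= lr * grad_b
    return w, b

# Hypothetical 1-D data: label flips from 0 to 1 as the feature grows.
X = np.array([[0.0], [1.0], [2.0], [3.0], [4.0], [5.0]])
y = np.array([0, 0, 0, 1, 1, 1])
w, b = fit_logistic(X, y)
probs = 1.0 / (1.0 + np.exp(-(X @ w + b)))
print(np.round(probs, 2))  # low probabilities on the left, high on the right
```

Unlike linear regression, the output is a probability for a categorical/discrete target rather than a numeric estimate, which is why the same idea extends to multi-class problems (one-vs-rest or softmax).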



Question 3)

Which of the following examples is/are a sample application of Logistic Regression? (select all that apply)


  • Likelihood of a homeowner defaulting on a mortgage.
  • Estimating the blood pressure of a patient based on her symptoms and biographical data.
  • The probability that a person has a heart attack within a specified time period, using the person’s age and sex.
  • Customer’s propensity to purchase a product or halt a subscription in marketing applications.




Question 4)

Which one is TRUE about the kNN algorithm?


  • kNN algorithm can be used to estimate values for a continuous target.
  • kNN is a classification algorithm that takes a bunch of unlabelled points and uses them to learn how to label other points.
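The first option is worth unpacking: if the final majority vote is replaced by an average of the k nearest neighbours' target values, the same neighbour search estimates a continuous target. A minimal sketch on hypothetical toy data:

```python
from math import dist

def knn_predict(X_train, y_train, query, k=3):
    """Estimate a continuous target as the mean of the k nearest
    neighbours' values (kNN used as a regressor, not a classifier)."""
    neighbours = sorted(zip(X_train, y_train),
                        key=lambda pair: dist(pair[0], query))[:k]
    return sum(y for _, y in neighbours) / k

# Hypothetical 1-D data where the target is roughly 10 * feature.
X = [(1.0,), (2.0,), (3.0,), (8.0,), (9.0,)]
y = [10.0, 20.0, 30.0, 80.0, 90.0]
print(knn_predict(X, y, (2.5,), k=2))  # mean of the two closest targets: 25.0
```

Note this also shows why the second option is false: the training points must carry labels (or target values) for the neighbours' answers to mean anything.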



Question 5)

What is “information gain” in decision trees?


  • It is the amount of information disorder, or the amount of randomness in each node.
  • It is the information that can decrease the level of certainty after splitting in each node.
  • It is the entropy of a tree before split minus weighted entropy after split by an attribute.
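The last option states the definition; a small worked example makes it concrete. With entropy measured in bits, a split that separates the classes perfectly has zero weighted child entropy, so the gain equals the parent's entropy (toy labels below are made up for illustration):

```python
from collections import Counter
from math import log2

def entropy(labels):
    """Shannon entropy (in bits) of a list of class labels."""
    n = len(labels)
    return -sum((c / n) * log2(c / n) for c in Counter(labels).values())

def information_gain(parent, children):
    """Entropy of the node before the split minus the weighted
    entropy of the child nodes after splitting by an attribute."""
    n = len(parent)
    weighted = sum(len(child) / n * entropy(child) for child in children)
    return entropy(parent) - weighted

parent = ["yes", "yes", "no", "no"]        # maximally mixed: entropy = 1 bit
pure_split = [["yes", "yes"], ["no", "no"]]
print(information_gain(parent, pure_split))  # 1.0 — all disorder removed
```

Decision-tree builders pick, at each node, the attribute whose split yields the highest information gain.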