
Artificial Intelligence Full Course _ Artificial Intelligence Tutorial for Beginners _ Edureka - Ep21

Time: 2025-07-11 11:17:26 Source: Codora.ai
Since this is a full course video and I have to cover all the topics, it is hard for me to make you understand each topic in depth. So I'll leave a couple of links in the description box, and you can watch those videos as well; make sure you check out the probability and statistics video.

All right, so now let's move on and look at our next algorithm, which is the K-nearest neighbor algorithm. KNN, which basically stands for K-nearest neighbors, is again a supervised classification algorithm: it classifies a new data point into the target class, or output class, depending on the features of its neighboring data points. That's why it's called K-nearest neighbor.

Let's try to understand KNN with a small analogy. Say we want a machine to distinguish between images of cats and dogs. To do this we must input a data set of cat and dog images, and we have to train a model to detect the animal based on certain features. For example, a feature such as pointy ears can be used to identify cats; similarly, we can identify dogs based on their long ears. After studying the data set during the training phase, when a new image is given to the model, the KNN algorithm will classify it as either a cat or a dog depending on the similarity of its features. Say a new image has pointy ears: it would be classified as a cat because it is similar to its neighbors, the cat images. In this manner the KNN algorithm classifies data points based on how similar they are to their neighboring data points. This is a small example; we'll discuss more about it in the further slides.

Now let me tell you a couple of features of the KNN algorithm. First of all, we know it is a supervised learning algorithm: it uses a labeled input data set to predict the output for new data points. It is also one of the simplest machine learning algorithms, and it can be easily implemented for a varied set of problems. Another feature is that it is non-parametric,
meaning that it makes no assumptions about the data. For example, Naive Bayes is a parametric model because it assumes that all the independent variables are in no way related to each other; it has assumptions built into the model. K-nearest neighbor has no such assumptions, and that's why it's considered a non-parametric model.

Another feature is that it is a lazy algorithm. A lazy algorithm is basically any algorithm that memorizes the training set instead of learning a discriminative function from the training data. Also, even though KNN is mainly a classification algorithm, it can be used for regression cases as well. So KNN is actually both a classification and a regression algorithm, but mostly you'll see it used for classification problems. The most important feature of K-nearest neighbor is that it is based on feature similarity with the neighboring data points. You'll understand this in the example I'm going to walk you through now.

In this image we have two classes of data: class A, which is squares, and class B, which is triangles. The problem statement is to assign the new input data point to one of the two classes by using the KNN algorithm. The first step in the KNN algorithm is to define the value of K. But what does the K in KNN stand for? K stands for the number of nearest neighbors, and that's why the algorithm is called K-nearest neighbors. In this image I've defined the value of K as 3. This means the algorithm will consider the three neighbors that are closest to the new data point in order to decide its class. The closeness between data points is calculated using measures such as Euclidean distance and Manhattan distance, which I'll be explaining in a while. So at K equal to 3, the neighbors include two squares and one triangle. If I were to classify the new data point based on K equal to 3, it should be assigned to class A, correct? It should be assigned to squares. But what if the K value is set to 7? Here I'm
basically telling my algorithm to look for the seven nearest neighbors and classify the new data point into the class it is most similar to. So at K equal to 7, the neighbors include three squares and four triangles, and if I were to classify the new data point based on K equal to 7, it would be assigned to class B, since the majority of its neighbors are from class B.

Now this is where a lot of us get confused: how do we know which K value is the most suitable for K-nearest neighbor? There are a couple of methods used to choose the K value; one of them is known as the elbow method, which we'll be discussing in the upcoming slides. For now, let me just show you the measures that are involved behind KNN. There's very simple math behind the K-nearest neighbor algorithm, so I'll be discussing Euclidean distance with you. In this figure we have to measure the distance between P1 and P2 by using Euclidean distance. I'm sure a lot of you already know what Euclidean distance is; it's something we learned in 8th or 10th grade. The formula is simply d(P1, P2) = √((x2 − x1)² + (y2 − y1)²). It's as simple as that. Euclidean distance is used as a measure to check the closeness of data points; basically, KNN uses the Euclidean distance to check the closeness of a new data point to its neighbors. So guys, KNN makes use of simple measures in order to solve very complex problems, and this is one of the reasons why KNN is such a commonly used algorithm.

Coming to support vector machines: this is our last algorithm under classification algorithms. Now guys, don't get paranoid because of the name; the support vector machine is actually one of the simplest algorithms in supervised learning. It is basically used to classify data into different classes; it's a classification algorithm. Unlike most algorithms, SVM makes use of something known as a hyperplane, which acts like a decision boundary between the separate classes.
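The two distance measures mentioned above can be sketched in a few lines of Python. This is a minimal illustration with made-up coordinates, not part of the original lecture:

```python
import math

def euclidean(p1, p2):
    # Straight-line distance: sqrt((x2 - x1)^2 + (y2 - y1)^2)
    return math.sqrt((p2[0] - p1[0]) ** 2 + (p2[1] - p1[1]) ** 2)

def manhattan(p1, p2):
    # Grid ("city block") distance: |x2 - x1| + |y2 - y1|
    return abs(p2[0] - p1[0]) + abs(p2[1] - p1[1])

p1, p2 = (1, 1), (4, 5)
print(euclidean(p1, p2))  # 5.0 (a 3-4-5 right triangle)
print(manhattan(p1, p2))  # 7
```

Euclidean distance measures the straight line between the points, while Manhattan distance sums the horizontal and vertical legs, which is why it is always at least as large.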
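The squares-versus-triangles walkthrough above (two squares and one triangle at K = 3, three squares and four triangles at K = 7) can also be reproduced with a small from-scratch KNN. The coordinates below are hypothetical, chosen so the majority vote flips exactly as in the example:

```python
import math
from collections import Counter

def knn_classify(train, new_point, k):
    """Classify new_point by majority vote among its k nearest neighbors."""
    # Sort the labeled points by Euclidean distance to the new point.
    by_distance = sorted(train, key=lambda item: math.dist(item[0], new_point))
    # Count the class labels of the k closest points and take the majority.
    votes = Counter(label for _, label in by_distance[:k])
    return votes.most_common(1)[0][0]

# Hypothetical 2-D data: class "A" = squares, class "B" = triangles.
train = [
    ((1.0, 1.0), "A"), ((1.5, 1.2), "A"), ((0.5, 3.0), "A"),
    ((2.5, 2.5), "B"), ((2.6, 2.4), "B"), ((2.7, 2.6), "B"), ((2.8, 2.5), "B"),
]
new_point = (1.2, 1.1)

print(knn_classify(train, new_point, 3))  # "A": 2 squares vs 1 triangle
print(knn_classify(train, new_point, 7))  # "B": 3 squares vs 4 triangles
```

This also shows why the choice of K matters: the same point lands in class A at K = 3 but in class B at K = 7, which is exactly the ambiguity the elbow method is meant to help resolve.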
