How do neurons work in machine learning?

If my data has three columns, that means I am supposed to have three inputs to this network: each neuron in the input layer represents one dimension. Likewise, each neuron in the output layer represents one value of my target column. Say the target takes the values 0 and 1: then one output neuron is supposed to detect 0, and the other is supposed to detect 1. So if I give a combination of inputs, say 0, 0, 0, when all three of them are zero the network will highlight which class that input is classified as. This is how we decide the input and output layers. Now comes the hidden part of it.

Now, what exactly is a hidden layer? It is our choice how many neurons we want here and how many layers. Some networks have ten layers, some have two, some have four. These are the same kind of neurons; the only thing is that they are connected to one another differently. So we have three parts: input, output, and hidden. Simple. Don't worry about the calculations; we will see them. So far so good on all three layers.

Now we come to something called the activation function. What is an activation? If you observe the network, you will notice that not all the neurons are always on. Did you see that? Some of the neurons are off, and in the output layer only one neuron is on. You might be wondering how this happens. This is the job of the activation function: it acts as a switch. Let me define an activation function right now. Say we design a function which says: if my number is greater than one, I will activate that neuron and pass the value on to the next neuron.

If the number is less than one, I will switch it off. Now, if I give several values at the input, what is going to happen? This function decides which neurons are active and which numbers are passed forward to the next neuron. These are called activation functions; this is a simple one I have shown you. Let us dive deeper into activation function theory. There are many such functions available, and we will go through them one by one. Before that, let me show you what happens inside a neuron. So imagine this one neuron here.
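The switch idea just described can be sketched as a tiny function. This is a minimal illustration with a name of my own choosing, not code from the lecture; the threshold of 1 comes from the example above.

```python
# An activation function acting as a switch: the value is passed on
# to the next neuron only if it exceeds the threshold; otherwise the
# neuron is switched off (outputs 0).
def switch_activation(value, threshold=1.0):
    return value if value > threshold else 0.0

print(switch_activation(2.5))   # above the threshold: neuron fires, 2.5 passes
print(switch_activation(0.4))   # below the threshold: neuron off, 0.0
```

The same pattern, with different thresholds and shapes, is what all the activation functions below implement.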

I have represented one neuron here as an example; all of the neurons have the same kind of look. What am I doing? I am giving it some input value: this is my input x0, and it has a weight w0. If you ask me what this weight is, it is simply a randomly chosen number. When your network is built, during forward propagation it is initialized with randomly assigned numbers. That random number gets multiplied with the input, so x0 becomes w0·x0. From the other neurons we similarly get w1·x1 and w2·x2. These are the inputs from the other neurons, and all three of them come in and converge here. If you want to see an example, just have a look.

Say we are talking about this particular neuron, and three inputs are coming in: these are my x0, x1, x2. Now, inside this cell, inside this neuron, what do we have? We have an activation function, call it f. And there is also one random number called the bias, which we have to add to the sum of all three weighted inputs. So if you observe, what we compute is w0·x0 + w1·x1 + w2·x2 plus some random number called the bias factor. Now, if this result is greater than a certain value, it is allowed to pass; if it does not satisfy my function, it is not allowed to pass. Now let me show you what these functions are. They may look very complicated, but the common activation functions are actually very easy.
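The computation just described, a weighted sum of the inputs plus a bias followed by the activation gate, can be written as a short sketch. The function name and the particular weights are my own illustrative choices.

```python
# One neuron: z = w0*x0 + w1*x1 + w2*x2 + bias, then an activation
# check that decides whether the value is allowed to pass.
def neuron(inputs, weights, bias, threshold=0.0):
    z = sum(w * x for w, x in zip(weights, inputs)) + bias
    # the activation acts as a gate: pass z only if it clears the threshold
    return z if z > threshold else 0.0

# three inputs converging on one neuron, with randomly chosen weights
out = neuron(inputs=[1.0, 2.0, 0.0], weights=[0.5, -0.2, 0.8], bias=0.1)
print(out)   # 0.5 - 0.4 + 0.0 + 0.1 = 0.2, which clears the threshold
```

In a real network the weights start random, exactly as the lecture says, and are then adjusted during training.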

We'll start with the easy function. Have a look at this one: unless and until your input is greater than zero, the output stays at zero. This is called a step function. The second one is the sigmoid function. Have you seen this earlier in any of the algorithms? Can you recollect where we have used the sigmoid? In logistic regression. And in one more place, in SVM kernels: you may remember one of the kernel options was the sigmoid (tangential) kernel. What does it mean? If your input is between a certain range, the output continuously increases; if your input is less than that range, the output is near zero; and if your input is more than that range, the output saturates at a static value of one.

If you expand this a little bit more, there is one more: the tanh function. It looks like the sigmoid, but the curve has to cross through zero, and that is what makes it the tangential form. The third one is called ReLU, the rectified linear unit. If the input is less than zero, the output is permanently zero; if it is more than zero, the output is linear in the input. This is one of the most popular activation functions, and the one we are actually using. So depending on what type of inputs you have, you can use one of these; down the line, when I show you real case studies.

How to decide your networks, at that time we will try out all these kinds of functions. And there is one more important activation function called softmax, for whenever you have a categorical output. We just saw that there are two types of possibilities here: the output could be categorical, or the output could be regressive, the same as we learned in machine learning. If your output is categorical, the default activation function you use in the output layer is softmax; it is a categorical activation function. If instead you have no categories and your output is a regression output, in that case you can use any of the others.
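The functions named above (step, sigmoid, tanh, ReLU, softmax) can all be written in a few lines each. These are standard textbook definitions, sketched here for illustration rather than taken from the lecture's own code.

```python
import math

def step(z):                  # 0 until the input crosses zero, then 1
    return 1.0 if z > 0 else 0.0

def sigmoid(z):               # smooth S-curve from 0 to 1, as in logistic regression
    return 1.0 / (1.0 + math.exp(-z))

def tanh(z):                  # like sigmoid, but crosses through zero; range (-1, 1)
    return math.tanh(z)

def relu(z):                  # rectified linear unit: zero below 0, linear above
    return max(0.0, z)

def softmax(scores):          # one raw score per category -> probabilities summing to 1
    m = max(scores)           # subtract the max for numerical stability
    exps = [math.exp(s - m) for s in scores]
    total = sum(exps)
    return [e / total for e in exps]

print(softmax([2.0, 1.0, 0.1]))   # largest score gets the largest probability
```

Notice softmax takes the whole output layer at once, which is why it is the default for categorical outputs, while the others apply to one neuron at a time.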

To be clear with all of you: what these functions look like in practice, we will see when I show you the implementation. But for now, remember that sigmoid, ReLU, and tanh are some of the most important ones. And why do we need an activation function? Just to manipulate our number and decide whether to let it pass forward. That's it. So this is how, at a very high level, a neural network works. Now let's try to simulate one. I have simulated a neural network in Excel; let me show you what it looks like.

Now, we cannot do this at a high level here, but a very simple neural network you can build. Observe closely: as input I have some alphabets, from A onward. Please remember that the computer does not understand what 'A' is. What we do is a bit of cheating at the back: we tell the computer that if it sees this number, it should be displayed as this figure, and if it sees that number, it should be displayed as that figure. Everybody agrees with that? This is binary logic, and all of it started at the very beginning of computer science. The same thing applies in neural networks as well. So, as I was saying, here is what I have done after converting it.
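The "cheating at the back" described above is just the standard character encoding. A quick sketch in Python's built-ins shows the mapping between the figure we see and the number the machine actually stores.

```python
# The computer only stores a number for each character; the display
# layer maps that number back to the figure we see on screen.
code = ord('A')               # 65: the number stored for the letter 'A'
letter = chr(code)            # 'A': the figure displayed for the number 65
bits = format(code, '08b')    # '01000001': the same number in binary logic
print(code, letter, bits)
```

A neural network's inputs work the same way: letters, images, and categories are all converted to numbers before the network ever sees them.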
