Transcript

[00:15] Okay, I'd like to remind you that we have an assignment that's due... oh, it's Monday, not Friday. Okay. So bring it in to class on Monday, or find a way to get it to one of the TAs sometime that day if you don't bring it to class. Any questions? All right.

[00:52] We are going to take a look at another data structure for single-ended priority queues today. This one is the binomial heap, and this structure will allow us to perform the same operations that are performed by a leftist tree. The difference here is going to be that the asymptotic complexity, the worst-case time per operation, will not be terribly good; in fact, one of the operations will take order n time in the worst case. Let me just put up the table.

[01:26] So we'll be doing the same set of three operations: insert, remove min, and meld. Compared to leftist trees, the actual complexity of the insert will be order one, while it is log n there, so that looks good. For remove min the actual complexity is n, and that's bad compared to the log n for leftist trees. For a meld the actual complexity is one, and that's better than the log n you get there. So as far as actual complexities go, one of the operations, the remove min, is quite bad. On the plus side, if you look at the amortized complexity, the amortized complexity of the remove min is log n, while the amortized complexity for the remaining two operations remains the same, which is order one. So from the amortized complexity point of view we get some improvement, in that the inserts and melds are better and the remove min has the same asymptotic amortized complexity. So if you're performing a sequence of operations where you have a lot of inserts and melds and not as many remove mins, you might expect the binomial heap to work well. Of course, if you don't have any remove mins you don't need any of these structures: if you're just doing inserts and melds, all you need to do is, for each priority queue, keep track of the element that has the min priority, and you just discard all the others.

[03:07] Now, we're going to be looking at the binomial heap not really because the binomial heap is an interesting structure on its own from the application point of view; we're going to be looking at it as a stepping stone to arrive at a structure we'll look at two lectures from now, which is the Fibonacci heap. That structure will allow us to perform a couple of additional operations efficiently; one in particular is going to be a decrease key, so we can reduce the value of a key in the structure, and we'll be able to do that in order one amortized complexity. That's important because a large number of greedy algorithms, in particular, say, Dijkstra's shortest path algorithm and Prim's minimum spanning tree algorithm, which you're probably familiar with, employ the decrease key operation.
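For reference, here is the complexity table the lecturer puts up, reconstructed from the spoken description (n is the number of elements):

    Operation     Leftist tree    Binomial heap (actual)    Binomial heap (amortized)
    insert        O(log n)        O(1)                      O(1)
    remove min    O(log n)        O(n)                      O(log n)
    meld          O(log n)        O(1)                      O(1)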
[04:13] If you use the Fibonacci heap to implement those algorithms, as we'll see a couple of lectures from now, you get better performance than using any of the other structures you might be familiar with. So really our objective is to use this as a stepping stone to get to Fibonacci heaps: instead of developing Fibonacci heaps in one big shot, we're going to do it in two steps, first the binomial heap, which is a little easier, and then augment it with the other two operations.

[04:36] All right, so let's take a look at what a binomial heap is. Again, there are two versions, depending upon whether you want a min priority queue or a max, and we'll just look at one of these versions, the min binomial heap. Well, the min binomial heap is simply a collection of min trees. At this time we're not going to place any constraint on the structure of these min trees; we'll simply say that it's a collection, or a set, of min trees. Now, we'll see that, by the way we define the operations on a min binomial heap, some structure gets imposed, so the rest of the definition will come later, as a result of the way in which the operations are performed.

[05:26] So this time it's just a collection of min trees. For example, here is a min tree, and I guess the root has a 2; I'm not sure if you can read the numbers from where you are. You can't? Okay, I can hardly read them from here. All right, so you've got a 2, then you've got a 4, a 5 and a 9, a 6, a 7, a 6 and an 8, so it's a min tree. Now, a min binomial heap is a collection of min trees, which means you can have multiple min trees in it; it doesn't have to be just one, or zero if it's empty. So here's another min tree, and these two together could define a min binomial heap; similarly, these three min trees together could define a min binomial heap, or these four together, or these five, or these six. Went too far... okay. So these six min trees could together define a min binomial heap, and that's the example we're going to be working with.

[06:31] [Question: Could a binomial heap rightly be called a forest?] Sure, you can say that this is a forest of min trees. A forest suggests a collection of trees, so you could use the term forest here. Any questions about what a min binomial heap is? Okay, so it's some number of min trees.

[07:07] In terms of representing it in memory, the node structure we're going to employ is this: each node is going to keep one of the elements, so one node of a min tree will get mapped into one node in our memory. Each node is going to have a degree field that tells you how many children the node has. It's going to have a child field, which will point to one of the children; it doesn't matter which one. Then we're going to use a sibling field, which will allow us to connect together the remaining children of a node, and so we form a circular linked list of siblings. And then there's going to be a data field, and that keeps track of the element in the node, which was shown by the numbers in the previous picture.
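As a concrete point of reference, here is a minimal sketch of that node structure in C++. The lecture doesn't fix a programming language, so the type name, the field names, and the use of an int priority are my own assumptions.

```cpp
// Hypothetical node for a min binomial heap, following the fields described above.
struct Node {
    int   data;     // the element stored in this node (its priority)
    int   degree;   // number of children this node has
    Node* child;    // pointer to any one of the children (nullptr if none)
    Node* sibling;  // next node in the circular linked list of siblings
                    // (for a root: next root in the top-level circular list)
};
```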
[08:09] All right, so we have a collection of min trees, and we've just talked about the structure of a node. All of the siblings get linked together to form a circular list: these three fellows will get linked together to form a circular list, these two will get linked together to form a circular list, and these two green fellows will form a circular list. Each node will have a pointer to one of its children, so this node could point to this child, that child, or that child, and since they're all in a circular list you can get to the other children. Each node will have a degree field, so this node has a degree of three. Then you have a data field, which allows you to keep the element; whatever element is being stored in the node, you keep it in the data field.

[09:21] [Question: If you link together the siblings in a circular linked list, doesn't that violate the definition of a tree?] Well, firstly, as far as the tree is concerned, the only requirement is that you have the parent-child relationship amongst nodes. A tree is an abstract entity; in how you represent a tree you have complete flexibility. So simply because I have chosen to link together the siblings doesn't change the underlying structure that I'm representing. I'm still representing a tree, but I can represent it any way I like. If I take this and map it into an array, it doesn't change the fact that I'm representing the tree; I've changed the representation, but I'm still representing this tree. You can take any object and represent it any way you like; it doesn't change the object, you're still representing that object. And that's what I'm doing here: I've got this tree and I've decided to represent the tree in a particular fashion, and that doesn't change the fact that I'm representing that tree.

[10:34] All right, so what you'll notice is that we've used up pretty much all of the fields in the nodes, except that the roots still have their sibling fields free. So the roots will get linked together to form a circular linked list as well, and that's what this is saying: we're going to link together the roots, so all of the min trees will also get linked together to form a circular linked list. Now, if I show all the pointers, the picture is going to get messy, and this is the only time we're going to show all of these fields, because the pictures get messy when you show all the fields. So I'm going to link together all of the min trees through their roots using the sibling field, and that gives me a circular list of min trees at the top level. I'm also going to link together all siblings using the sibling field; that was the purpose of defining a sibling field in the first place. So all of these will get linked together, and then the child field in the parent points to one of the children; in this case it's pointing to the first one here, but it could have gone to any one. Do the same thing here: a circular list of siblings, and the parent has a child field which could go to either one of those two; in this case it's going to the leftmost one, but it doesn't have to, it could go to either one. If you come down here, there's a circular list of just one fellow, and the parent has a child pointer coming to that one child. And that takes care of the rest of them.
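Because a parent stores only one child pointer and the remaining children are reached through the circular sibling list, visiting all children of a node might look like the following sketch; it reuses the hypothetical Node type from the earlier snippet, and forEachChild is my own illustrative name, not something from the lecture.

```cpp
#include <functional>

struct Node { int data; int degree; Node* child; Node* sibling; };

// Visit every child of `parent` by starting at the one child the parent points
// to and walking the circular sibling list until we come back around.
void forEachChild(Node* parent, const std::function<void(Node*)>& visit) {
    if (parent == nullptr || parent->child == nullptr) return;
    Node* first = parent->child;
    Node* cur = first;
    do {
        visit(cur);
        cur = cur->sibling;   // circular list, so we eventually return to `first`
    } while (cur != first);
}
```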
[12:27] In addition to this, I'm going to keep a pointer which will allow me to get to this whole representation, and that pointer is going to go to the root that has the smallest element. So I can access the smallest element in my priority queue in order one time by just following that pointer. If the structure is empty, the pointer doesn't point to anything; otherwise it points to the node that has the smallest element, so you can find the minimum element in one unit of time. Now, what's missing from this picture? I haven't shown the degree fields; those are numbers you can attach to every single node, the number of children the node has.

[13:21] All right, so a min binomial heap is just a collection of min trees. In terms of the representation within a computer, we use nodes that have a child field, a sibling field, a data field, and a degree field. If you compare this with the representation of a leftist tree: there, each node has a data field and each node has a left child and a right child pointer, while here we have child and sibling, so that's kind of equivalent. In the leftist tree you use an s-value field; here we use a degree field (it's a field, not a pointer). The s-values can be at most log n, and as we will see towards the end of today's lecture, the degrees can also be at most log n, so both nodes are of exactly the same size in both structures. Space-wise, binomial heaps and leftist trees require the same amount of space per node.

[14:35] All right, now as I said, we will never again see a picture like this, because there's no way to figure out what's going on in such a messy picture; we'll just see the trees represented in their abstract form. Let's look at how we do the operations on this structure; it's really how you define the operations that provides the rest of the structure that's necessary to get the stated amortized complexity.

[15:09] So here is my collection of min trees, and into this collection I want to add an element whose priority is 10. The way I'm going to do that is: I'm going to create a min tree that has a single item in it, the 10, and I'm going to link it into that top-level circular list of min trees. So I'm just going to increase the number of trees by one. I'm going to link it into the top-level circular list, and you can do that in one unit of time: you have a pointer to one of the nodes in the top-level list, so we just put this one next to that one in the list. Adding next to a node takes one unit of time. And then we update the pointer to the structure if the new item is smaller than what we're currently pointing at; so if this had been a zero or a minus six or something, then I'd change the pointer from there to here.

[16:26] All right, so an insert runs in order one time: create a new node, put it into the top-level circular list, and update the structure pointer if necessary. Any questions about how an insert works? [Question about how this relates to inserting into a circular linked list.] Inserting here and inserting into a circular linked list are the same thing, because that's all I'm doing: I'm taking my circular linked list of trees and inserting this into it. So that's insertion into a circular linked list, and then I'm updating the pointer.
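Here is a hedged sketch of that insert, under the same assumptions as the earlier snippets; representing the whole heap by its min pointer is my own modeling choice, matching the picture the lecture describes.

```cpp
struct Node { int data; int degree; Node* child; Node* sibling; };

// Insert element x and return the (possibly updated) min pointer.  O(1).
Node* insert(Node* min, int x) {
    Node* n = new Node{x, 0, nullptr, nullptr};
    if (min == nullptr) {              // heap was empty
        n->sibling = n;                // circular list containing just n
        return n;
    }
    n->sibling   = min->sibling;       // splice n in right after the min root
    min->sibling = n;
    return (x < min->data) ? n : min;  // update the pointer only if x is smaller
}
```

Starting from an empty heap, for example, Node* h = nullptr; h = insert(h, 10); adds the one-element tree containing the 10.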
[17:12] I still have other operations to look at.

[17:34] [Question: If in an insert all you do is take the new item, create a tree of size one, and put it in, how do you create these trees whose height is more than one?] Well, if an insert doesn't do it, then one of the other operations has to do it. We haven't seen all of the operations; we've only seen an insert, and obviously if you start with an empty structure and you do a thousand inserts, all you're going to get is one thousand one-element trees, because an insert, as you've correctly noted, does not increase the size of a single tree; it only increases the number of trees in your collection. All right, other questions?

[18:40] [Question: When you insert, do you have to restructure one of the trees in the binomial heap?] When we insert an element, like the 10, I create a new tree and I put it in; I don't change any of the other trees. I don't change any existing tree, I just increase the number of trees by one; the collection contains this one extra element, and if this value is smaller than what I'm currently pointing at, then I update the pointer to point to this fellow, because the pointer has to go to the smallest root. That's the only requirement. I don't mess around with the other trees; I don't look at them at all, other than the root of the smallest one. It's hard to believe, but insert is a very simple operation; that's all it is, I haven't left anything out, and that's why it runs in order one time.

[19:44] All right, let's take a look at meld. In a meld you're given two different min binomial heaps, and you are to combine them into one. So you have two circular lists of trees, and I need to combine them into one; well, that's just combining two circular lists into one. So here's the min binomial heap A, you have another one B, and you want to meld them into one. That just means you're going to end up with a min binomial heap that has these two min trees as well as those two, so a four-tree min binomial heap; when you're done, A and B do not exist as independent min binomial heaps.

[20:23] So I'm just going to combine the two circular lists, and the way you combine circular lists, if you recall, is you change the pointer here to point to the second element there, and then you change the pointer there to point to the second element here. When you do this you of course lose that pointer, and you do the same operation on the other side, so you lose this pointer here. So you make two pointer changes, and once you make these two pointer changes you can start at any one node at the top level: you start at the seven, you follow the pointers, you get to the five, then you get to the four, you get to the one, and you come back to the seven. So you get a circular list of all of the trees; two pointer changes enable you to combine the two top-level lists. Once you've done that, you need to find out which is the smallest root; the smallest root is either A's or B's, so you compare the 1 and the 5, it's the 1, and that becomes the pointer for the new binomial heap. So we set the min element pointer, and that's going to go to the 1, and now you end up with this structure; I've thrown out the pointers that got overwritten.
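A sketch of the meld under the same assumptions: the two pointer changes splice the two top-level circular lists into one, and the smaller of the two old min roots becomes the new min pointer.

```cpp
struct Node { int data; int degree; Node* child; Node* sibling; };

// Meld two heaps given by their min pointers; returns the new min pointer.  O(1).
Node* meld(Node* a, Node* b) {
    if (a == nullptr) return b;
    if (b == nullptr) return a;
    Node* aNext = a->sibling;              // remember a's successor
    a->sibling  = b->sibling;              // first pointer change
    b->sibling  = aNext;                   // second pointer change
    return (a->data <= b->data) ? a : b;   // the smaller root becomes the min pointer
}
```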
[22:11] Okay, so now we've combined two circular lists. Two circular lists can be combined in one unit of time by changing two pointers, and then you update or set the new min element pointer, and so you can do a meld in order one time. Now, you'll notice that a meld doesn't change the shape of trees, so if you started with one-element trees before the meld, you'll still have one-element trees after the meld. If you're going to get big trees, there's only one other operation left, and that's the remove min; that's the fellow that's going to have to create the big trees. All right, so that's all there is to a meld. Do you have any questions about a meld? It runs in order one time.

[23:08] All right, so the actual complexity of insert and meld is order one; let's take a look at remove min. If the heap is empty, there's nothing to remove. If the heap is not empty, then you have a min element pointer that tells you where the min element is, so you can find the element easily. Once you find the element, we want to take this element out. It's in a top-level circular list, so we want to take it out of its circular list, throw away that node, and then take its subtrees and bring them back into the top-level circular list. So we're going to take this min tree out of its top-level circular list, delete the root and reinsert its subtrees back into the top-level circular list, and then update the binomial heap pointer. This is the minimum amount of work you have to do to effect the remove min operation; if you do these things, then you should be able to see that you have actually removed the min element and left behind a proper min binomial heap. Of course, thinking ahead, you're probably saying, well, if this is all I do, none of these operations grows trees either. We'll look at how trees actually grow from a size of 1 to anything bigger after we've seen how to perform this minimal amount of work that's necessary to do the remove min.

[25:03] So let's first take a look at doing those three things. To take the tree that has the smallest element out of its circular list: that's just a removal from a circular list. So I've got a pointer coming into a circular list, and each of these entries is really a tree. I've got a pointer here and I want to get rid of that tree. Now, really what I want to do is get rid of the data here; I want to get rid of the C from my circular list. And if I think of this as wanting to remove this node, that's going to be difficult, because to remove this node I need to know the fellow behind me, so I can change his pointer to go over there.
That's hard. Well, you can do it: you start here and follow next pointers all the way around until you come to the fellow who points to you, but that's going to take you time linear in the number of nodes in the circular list. So I'm not going to try to remove that node physically; the objective was to remove the element that's stored there, which is C.

[26:22] All right, to remove the element, I ask: is there a next node different from this one? If there isn't, well, then you have an empty structure, so you're done. But suppose there is a next node, as there is here, different from this one. Then what I'm going to do is copy the data from that node into this one. Now that's taken C out of my structure; the only problem is I've got two copies of D. So I'm going to get rid of that node, the original copy of D, and for that I'm going to change this pointer to that one, which I can do in one unit of time. So the point is, I don't care about physical nodes, I care about the data, and it's the data C that I wanted to take out; I can take the data out, leaving behind a faithful circular list, in order one time, by doing a copy from the forward node and removing the forward node. So I can take the min tree out of the top-level circular list of min trees in one unit of time, by doing a copy and a pointer change. That was the first thing I had to do.

[27:59] All right, then once you take the tree out, I'll take the root node and throw it away. The child pointer points to the subtrees, so you have a circular list of subtrees: the child pointer in the root, which is not shown here, pointed to, say, this fellow, or maybe it pointed to that fellow, and they're all linked together in a circular list. I've got to take those subtrees and put them back into the circular list that I started with, the one I took a tree out of. So I've got this circular list; maybe you started with three trees, we took one out (that's the fellow that had the smallest item), so I'm left with two, and the one I took out had two subtrees. I take these two and put them back in; basically this is combining two circular lists into one, and we've seen how to do that in one unit of time.

[29:08] All right, so as far as the minimal work is concerned, there were three steps. One was to take the min tree out of its circular list; we can do that in one unit of time. The second step was to reinsert the subtrees of the min tree; we can do that in one unit of time. The third step is to update the min element pointer. Well, the new minimum could be the root of any one of the trees in that top-level circular list, so the only way to update it is to go and physically examine all of those roots and see which one has the smallest element, and that could take you time linear in the number of trees, which could be n. So if you look at this whole thing: removing the min tree takes one unit of time, reinserting its subtrees takes one unit of time, but updating the pointer takes time linear in the number of trees you have after you've done those two things, and that number, call it s, could be as much as the number of elements. For example, if you do n inserts, you have n trees; you do a remove min operation, one tree disappears, and it doesn't have any subtrees to put back, so now you've got n minus 1 trees, and to update the pointer you've got to look at all n minus 1 of them, spending order n time.
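Putting those three steps together, here is a sketch of this minimal remove min (no combining yet) under the same assumptions as the earlier snippets; the final scan over all remaining roots is what makes it order n in the worst case.

```cpp
struct Node { int data; int degree; Node* child; Node* sibling; };

// Minimal remove min: take out the min tree, reinsert its subtrees,
// then scan every root to reset the min pointer.  Returns the new min pointer.
Node* removeMinMinimal(Node* min) {
    if (min == nullptr) return nullptr;           // empty heap: nothing to remove
    Node* subtrees = min->child;                  // circular list of the root's subtrees
    Node* rest;                                   // remaining top-level list
    if (min->sibling == min) {                    // the min tree was the only tree
        delete min;
        rest = nullptr;
    } else {                                      // copy trick: overwrite the min node
        Node* next = min->sibling;                // with its successor, drop the successor
        min->data = next->data;  min->degree = next->degree;  min->child = next->child;
        min->sibling = next->sibling;
        delete next;
        rest = min;
    }
    Node* all;                                    // combine the two circular lists
    if (rest == nullptr)          all = subtrees;
    else if (subtrees == nullptr) all = rest;
    else {
        Node* t = rest->sibling;
        rest->sibling = subtrees->sibling;
        subtrees->sibling = t;
        all = rest;
    }
    if (all == nullptr) return nullptr;
    Node* newMin = all;                           // O(#trees) scan for the new minimum
    for (Node* p = all->sibling; p != all; p = p->sibling)
        if (p->data < newMin->data) newMin = p;
    return newMin;
}
```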
[30:55] All right, so with the way we've got the operations set up right now, an insert takes order 1, a meld takes order 1, and a remove min takes order n, and there is no way to get trees whose size is more than 1; so really all you're building is circular lists of elements. Now, while this is sufficient to get a correct structure, it doesn't enable us to get a structure that has the amortized complexity we claimed. In order to get that amortized complexity, it's necessary to do more than the minimum amount of work needed to do the remove min, and this extra work is what's going to build trees of size bigger than 1. This extra work will enable us to have a structure for which the amortized complexity is log n for the remove min operation and order one for each of the other two, and we're going to be able to prove that.

[32:15] So we need to do more work than this, and the extra work is going to be done in a remove min: when we're done with this extra work, we're going to make sure that no two trees that are left behind have the same degree. So we'll combine together trees whose degree is the same, ending up with a tree whose degree is one more, and we will ensure that no two trees that we leave behind have the same degree. Now, to do this extra work we're going to have to examine every tree that is left. So if you're still thinking of "take out the min tree, reinsert the subtrees, and then look at all the trees that are left", we're going to have to look at them all and we're going to have to do the combining; you have to look at them all in any case to find the new min element. So we're really not going to perform the reinsert step as described, and we really don't even need to do the "remove the tree from its circular list" step as described, because we're going to have to look at all of the remaining trees in any case. I might as well merge all of these things into one big step, where I look at all the trees in the top-level circular list except the one being taken out, and all the subtrees of the tree being taken out, and combine together pairs of trees that have the same degree.

[33:56] Let's see how this operation works. Go back to my original example: I've got all of these trees, and let's assume this is the state after you have done the removal part of a remove min. So I'm left with these seven trees, some of them original top-level trees, some of them subtrees of the node I threw out. I can't leave this configuration, because this configuration has some trees of the same degree.
These three are of degree zero, and these two are of degree one. The objective is that when you finish with the remove min, the trees you leave behind will all have different degrees.

[34:50] So to meet that objective, I will examine all of my trees in some order; it doesn't matter what order, so we'll consider some ordering. As I go through, I will ensure that no two trees have the same degree, and computationally, to make this happen quickly, I will use a table which keeps track of the trees seen of each degree. Let's suppose that the maximum degree we expect is four; then my table will have entries 0 through 4. Initially the table is empty, I haven't seen any trees. Then I sweep from left to right, in whatever order you decided to look at these. This tree here: I use its degree field, which tells me it's a tree of degree 3, so I take that and I put it into my table at position 3. Then I go to the next tree, the yellow tree there; that tree has a degree of 0, so I enter it into my table at position 0. Then I go to the next one; that's also got degree 0, so I'd like to put it into my table, but that spot in the table is already used. That tells me I've already seen a tree of degree 0, so now I've got two trees of degree 0, and that's not allowed, so I'll combine these two together. The table tells me where this other tree of degree 0 is, this one, and I'm going to combine them. I've got to be left with a min tree, so the fellow with the bigger root is made a subtree of the fellow with the smaller root; in case of a tie you can go either way. So in this case the 10 becomes a subtree of the 4 (a sketch of this combining step follows).
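Here is a small sketch of that single combining step in the same hypothetical representation (combine is my own name for it): the tree with the larger root is made a child of the tree with the smaller root, and the surviving tree's degree goes up by one.

```cpp
#include <utility>

struct Node { int data; int degree; Node* child; Node* sibling; };

// Combine two min trees of equal degree into one tree of degree one higher.
Node* combine(Node* a, Node* b) {
    if (b->data < a->data) std::swap(a, b);     // a keeps the smaller root
    if (a->child == nullptr) b->sibling = b;    // b becomes a's only child
    else {                                      // splice b into a's circular child list
        b->sibling = a->child->sibling;
        a->child->sibling = b;
    }
    a->child = b;
    a->degree += 1;
    return a;
}
```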
[36:58] So that gives me a tree of degree 1, and I take that tree of degree 1 and I'm going to put it in the table. The tree of degree 0 is gone, because I combined it with another tree of degree 0 when I created the tree of degree 1, so I update my table: the degree 0 entry is gone, and I've got a degree 1 entry. All right, then I continue. I see the blue fellow; that's degree 1. I check my table, which says you've already got a tree of degree 1, that's the yellow one, so you combine the two together, and the rule is the same: the fellow with the bigger root becomes a child of the fellow with the smaller root, so the 5 becomes a subtree of the 4. When you do that, the degree goes up by 1, so the degree becomes 2, and then we need to update the table: the tree of degree 1 is gone, it's been consumed, and you now have a tree of degree 2.

[38:04] Then I go to the next tree that has to be looked at; it's degree 0. I haven't got a pending tree of degree 0, my table has a null pointer there, so I put it in the table. Then I go to the next tree, the green one; its degree is 2. I check my table: there's already a tree of degree 2 pending, the yellow one, so I combine the yellow and the green trees; the fellow with the bigger root, the yellow one, becomes a subtree of the fellow with the smaller root. That gives you a tree of degree 3 (whenever you combine, the degree goes up by one). So I'd like to put this in the table, but I look at the table and I see I have a pending tree of degree 3, so I follow that pointer, get hold of that tree, and combine these two together, and that gives me a tree of degree 4: the fellow with the bigger root becomes a subtree of the fellow with the smaller root, giving me a tree of degree 4. Then I go and look at my table and see if I have a pending tree of degree 4; if so, I'd combine them together to get a tree of degree 5, and so on. In this case the table's degree 4 entry is empty, so there's no pending tree of degree 4, and I put this fellow in the table. All right, then I look at the remaining tree, the red one, degree 1; I check my table to see if I have a pending tree of degree 1, I don't, so it goes into the table.

[39:57] So using this process, by the time you're done examining all of your trees, your table will contain all of the trees that survived, and they all have different degrees. Then I sweep through the table, collecting the trees, link them together into a circular list, and find out which one has the smallest root. So you create a circular list of the remaining trees; when you do that, we'll examine the table either from 0 through 4 or from 4 through 0, and as you pull the trees out you set those spots back to null, reinitializing the table. The table is initialized only once, at the start of the whole program; after that, every remove min, once done, leaves the table with all null entries.

[41:02] What about the complexity of remove min? At the start of the program you need to create and initialize the table, and if the maximum degree of a tree is MaxDegree, then you need that much time to create the array and set all the pointers to null. You need to examine all of the trees; we're saying there are s of them after you have taken out the min tree, thrown away the root, and put back its subtrees. So we're going to examine those, left to right, one at a time. The examination itself could involve many combines: you may have a tree of degree 0 that gets combined with another tree of degree 0, which gives you one of degree 1, that combines with a tree of degree 1 to give you one of degree 2, which combines with a tree of degree 2 to give you one of degree 3, and so on. So on each tree you could spend a lot of time, you could spend MaxDegree amount of time, but each time we do a combine the total number of trees reduces by one. If you started with seven trees, one combine will bring you to six, another combine brings you to five, so the maximum number of combines you can do is s minus 1. So again we're using an aggregate type of analysis: the total time you can spend on the combines is order of s, because you can't do more than s minus 1 combines all told, and each combine takes one unit of time. So it takes order s time to examine all of the trees and do all of the combines. Then you have to collect all the trees from your table; the table has MaxDegree plus one spots in it, so it takes order MaxDegree time to collect all the trees, set the min heap pointer, link them together, and also set all the entries back to null.
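A hedged sketch of that whole pass, under the same assumptions as before. Here roots stands for the s trees to be examined (the remaining top-level trees plus the removed root's subtrees) and MAX_DEGREE for the table size; both names are my own, and the inner loop repeats the combining step sketched earlier.

```cpp
#include <vector>
#include <utility>

struct Node { int data; int degree; Node* child; Node* sibling; };

// Pairwise combine: afterwards no two surviving trees share a degree.
// Returns the new min pointer for the heap (nullptr if no trees remain).
Node* pairwiseCombine(const std::vector<Node*>& roots, int MAX_DEGREE) {
    std::vector<Node*> table(MAX_DEGREE + 1, nullptr);   // trees seen, indexed by degree
    for (Node* t : roots) {
        while (table[t->degree] != nullptr) {            // same degree already seen:
            Node* other = table[t->degree];              // combine the two trees
            table[t->degree] = nullptr;
            if (other->data < t->data) std::swap(t, other);  // t keeps the smaller root
            if (t->child == nullptr) other->sibling = other;
            else { other->sibling = t->child->sibling; t->child->sibling = other; }
            t->child = other;
            t->degree += 1;
        }
        table[t->degree] = t;
    }
    // Collect the survivors into a circular list, find the smallest root,
    // and reset the table entries to null for the next remove min.
    Node* min = nullptr; Node* first = nullptr; Node* prev = nullptr;
    for (int d = 0; d <= MAX_DEGREE; ++d) {
        Node* t = table[d];
        if (t == nullptr) continue;
        table[d] = nullptr;
        if (first == nullptr) first = t; else prev->sibling = t;
        prev = t;
        if (min == nullptr || t->data < min->data) min = t;
    }
    if (prev != nullptr) prev->sibling = first;          // close the circular list
    return min;
}
```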
[43:12] So your total time becomes order of MaxDegree plus s. Now, what we'll see is that MaxDegree is log n, but s can still be n or n minus 1, so that doesn't change the actual complexity of this operation from order n; it remains order n. It's just that with this new complexity of MaxDegree plus s, resulting from this pairwise combining, we're going to be able to prove that the amortized complexity is log n, even though the worst-case complexity per remove min is still order n. So the actual complexity remains order n despite this additional work that we're doing.

[44:13] All right, let's today prove that the maximum degree is log n, and then in the next lecture we will do the amortized complexity analysis. So let's define something called a binomial tree; that's where the structure gets its name. B_k is a binomial tree whose degree is k. By definition, B_0 is a single-node tree, and when k is bigger than 0, by definition B_k is comprised of one instance of each of the smaller B's: you've got to have one B_0, one B_1, one B_2, and so on up to one B_{k-1}, to get a B_k. So that's the definition; let's go through an example. That's B_0. B_1 is a root, and then you've got to have a B_0. B_2 is a root, and you have a B_0 and a B_1. B_3 is a root, and you have a B_0, a B_1 and a B_2 as its subtrees. If you count the number of nodes here, it's 1, then 2, 4, 8.

[45:30] [Question: You said a B_1 has as a subtree a B_0, and a B_2 has subtrees B_0 and B_1; why can't it have two subtrees that are B_1's?] Well, that's not the definition of a binomial tree. The definition of a binomial tree, as we had on the previous slide, is that B_k has subtrees B_0 through B_{k-1}, exactly one of each; that's the definition of a binomial tree. So B_2 by definition is going to have a B_0 and a B_1, and B_3 by definition is going to have a B_0, a B_1 and a B_2; the order in which you attach them is not important.

[46:29] At least from this picture, it looks like a B_k has an exponential number of nodes: you've got one, two, four, eight. Well, let n_k be the number of nodes in B_k. B_0 has one node, so n_0 is 1, and if you look at a general B_k, you've got one copy of B_0, one of B_1, and so on, plus the root, so n_k = n_0 + n_1 + n_2 + ... + n_{k-1} + 1, and you can prove by induction that this is 2 raised to k.

[47:16] An alternative definition of B_k is that B_k is made up of two B_{k-1}'s. To see that, if you look at B_1, it's a B_0 and a B_0, one a subtree of the other. If you look at B_2, it's two B_1's; if you look at B_3, it's two B_2's. So starting with the first definition you can prove the equivalence with the second definition, that is, a B_k for k bigger than zero is two B_{k-1}'s, one a subtree of the other. And from this it's quite easy to see that the number of elements doubles each time you increase k by one, so the number of elements is 2 raised to k.
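For completeness, here is the induction step the lecture alludes to, assuming the hypothesis n_i = 2^i holds for every i < k:

    n_k = 1 + (n_0 + n_1 + ... + n_{k-1}) = 1 + (2^0 + 2^1 + ... + 2^{k-1}) = 1 + (2^k - 1) = 2^k.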
[48:25] [In response to a question:] That's right. By definition, a binomial tree B_k has as subtrees one B_0, one B_1, one B_2, and so on up to one B_{k-1}. That's by definition; if you don't satisfy that, you're not a binomial tree. And for the time being, this has nothing to do with binomial heaps; in a moment we'll see the connection, but the binomial tree is a concept independent of binomial heaps. All right, and then we had this equivalent definition, and from that definition, as I said, it's easier to see that the number of elements is 2 raised to k.

[49:16] Now, if you think about binomial heaps, the only trees that are created other than by a remove min operation are single-node trees, and those are B_0's. So whenever you do an insert you create a B_0; when you do a meld you don't create anything new; but when you do a remove min, you either leave behind trees that you started with, or, if you create something new, it is the result of combining two trees of the same degree. Using that, you can see that the only things you can create are binomial trees, because a binomial tree is two trees of the same degree, one made a subtree of the other, and for the induction base, trees of size 1 are B_0's. So the only trees created, with the operations the way we've defined them, are binomial trees: you get B_0's from an insert, and everything else that gets created comes from a remove min, and that's by combining trees of equal degree, so it's got to be a B_k for some k. So we get only binomial trees, and since binomial trees have an exponential number of elements, if you have n elements, the degree of a tree can't be more than log of n. So MaxDegree is log n, where n is the total number of elements.

[50:46] All right, so we're going to stop here. We'll use this result next time to prove the amortized complexity for the binomial heap operations.
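Restating that closing bound as an inequality (my paraphrase of the argument): a tree of degree k in the heap is a B_k and therefore contains 2^k elements, so 2^k <= n, which gives k <= log_2 n; hence MaxDegree <= log_2 n.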