Transcript

00:16 Okay, all right then. Any questions you'd like to ask first? Nope?

00:25 All right, so we're going to start looking at data structures for single-ended priority queues today. We'll start by looking at the leftist tree: I'll first define what a leftist tree is, and then we'll take a look at how you can perform the various single-ended priority queue operations with a leftist tree. What we'll see is that although we already know of a data structure that performs single-ended priority queue operations very efficiently, namely the heap, the leftist tree is going to allow us to perform an additional operation, that of melding, of combining two single-ended priority queues into one, in an efficient manner. So suppose you have two min heaps and you need to combine them into one. That might happen, for example, if you have several queues and one of the servers dies; then you want to take the queue associated with that server and combine it with one of the other queues. If you want to meld two heaps, well, the heaps are typically sitting in separate arrays, and to combine two structures in two different arrays into a single array, you have to copy the elements from at least one array into the other. So you need to go to some linked scheme if you want to avoid spending time linear in the size of one of the two structures. What we will see is that the leftist tree is a fairly simple structure which will allow us to meld two single-ended priority queues in logarithmic time.

02:03 All right, so a leftist tree is a linked binary tree, and it will allow us to do everything that a heap can do in the same asymptotic amount of time. For example, you can insert an element into your collection; you can remove the min element if you have a min version of a leftist tree, or the max element if you have a max leftist tree; and you'll be able to initialize. The first two operations will take logarithmic time, and the initialize will take linear time, just as it does when you use a heap. But in addition to that, we'll be able to perform an operation called meld that will allow us to combine the elements in two priority queues into one.

02:54 To define a leftist tree, we'll use the notion of an extended binary tree, something we've talked about earlier: you simply take a conventional binary tree, and wherever there's a null pointer you attach an extra node called an external node. So the blue nodes are the normal nodes of the binary tree; the red nodes have been put in wherever there was a null pointer otherwise, and that gives me an extended binary tree. We know from earlier that the number of external nodes is always one more than the number of internal nodes.

03:43 On the extended binary tree I'm going to define a function s. This function is defined at every node: it gives me the length of the shortest path from that node to an external node in its subtree.
04:05 So, for example, if you look at this extended binary tree, then for all of the external nodes, the length of the shortest path from there to an external node is zero: you start there and you're already at an external node. So all of those nodes have an s value of 0. Now if you look at this node here, there are two external nodes in its subtree, these two, and both of them are at distance one from here, so the s value there is going to be one. The same is true for this node, its s value will be 1, and if I go to that node over there, the s value will also be 1. Coming over here, all of the external nodes in its subtree are at distance 2, so the shortest is distance 2. This fellow has got two external nodes, both at distance 1, so its s value is 1. This internal node has three external nodes in its subtree; this one is at distance 1 and these two are at distance 2, so it will have an s value of 1. Coming here, this fellow has got six external nodes: these two are at a distance of 2 and those four are at a distance of 3, so the s value there will be 2. In fact, it's not hard to see that the s value at an internal node is obtained by looking at the s values at its two children: you take the smaller of the two and add one. The children's s values give you the distance to the nearest external node in the left subtree and the nearest external node in the right subtree; take the smaller of the two and add one, and you get the s value for the node you're currently at. Same thing here: to get the s value here, I take the smaller of the s values of the two children, that's zero, and add one, so I get one. Moving to the root, you take the smaller of the s values of the two children, that's one, and add one to that to get two.

06:19 All right, so we've defined the s value of a node: the length of a shortest path from that node to an external node in its subtree. Any questions about what the s value is?

06:36 All right, so let's look at some properties of s. The s value of an external node is zero, and for an internal node, s is the smaller of the s values of the left child and the right child, plus one. So you can compute the s values of the nodes of any binary tree in linear time by using a postorder traversal; in the visit step you just evaluate this equation.

07:13 Now let us define a leftist tree. There are actually two types of leftist trees, one called height-biased and the other called weight-biased; in class we'll look only at the height-biased version. A height-biased leftist tree has the property that the s value of the left child is greater than or equal to the s value of the right child, for every internal node x. So the s values on the left side are at least as big as the s values on the right side.
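To make the recurrence concrete, here is a minimal sketch (my own illustration in Python, not code from the lecture) of a linked node and the postorder computation of s. A null pointer, None, plays the role of an external node, so its s value is 0:

```python
class Node:
    def __init__(self, key, left=None, right=None):
        self.key = key
        self.left = left
        self.right = right
        self.s = 1          # s value; every internal node has s >= 1

def s_value(node):
    """s of an external node (represented by None) is 0."""
    return node.s if node is not None else 0

def compute_s(node):
    """Postorder traversal: children first, then s = min(children) + 1."""
    if node is None:        # external node
        return 0
    ls = compute_s(node.left)
    rs = compute_s(node.right)
    node.s = min(ls, rs) + 1
    return node.s
```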
07:49 Look at the example tree we were dealing with. If you pick any internal node, for example this one, the s value here is 2 and it's greater than or equal to the s value on the right. You look at this internal node: the s value here is greater than or equal to the s value there. Look at this one here: the s value here is 0 and that's 0, so the left is greater than or equal to the right. So if you check every internal node there, it satisfies the requirement for a height-biased leftist tree, and this is an example of a height-biased leftist tree. You can easily get an example of a binary tree that is not a height-biased leftist tree: simply swap the left and right subtrees of the root. Then the s value here will be 1 and over there it'll be 2, and you'll have something that's not a height-biased leftist tree.

08:48 If you are given any binary tree, you can convert it into a height-biased leftist tree by doing a postorder traversal, where in the visit step you swap the left and right subtrees in case there's a violation of the leftist tree property: if the s value on the left is smaller than the s value on the right, swap the left and right subtrees. Any questions about what a leftist tree is?
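And a sketch of the conversion just described, under the same assumed Node class: fix both subtrees first, in postorder, then swap at the current node if the leftist property is violated:

```python
def to_leftist(node):
    """Postorder: fix both subtrees, then swap here if the leftist
    property s(left) >= s(right) is violated. Returns the node's s value."""
    if node is None:
        return 0
    ls = to_leftist(node.left)
    rs = to_leftist(node.right)
    if ls < rs:                       # violation: smaller s on the left
        node.left, node.right = node.right, node.left
        ls, rs = rs, ls
    node.s = rs + 1                   # right child now has the smaller s
    return node.s
```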
09:21 Now let's get some properties of leftist trees, to see why they might be of interest for our application. [A student asks about AVL trees.] Yeah, AVL trees aren't necessarily leftist trees; there's no requirement there that the s value on the left side be greater than or equal to the s value on the right side. We'll see when we look at AVL trees, several lectures down the road, that they're different from leftist trees.

09:50 All right, so the first property of leftist trees is that the rightmost path, where you start from the root and keep making a sequence of right-child moves till you reach an external node, is the shortest way to get from the root to an external node. That follows from the definition: we said the s values on the left are at least the s values on the right, so at any node, if you want to get to the nearest external node, you will move to the right child rather than to the left child. So following the rightmost path is the shortest way to get to an external node, and the length of that path is the s value of the root, because by definition s(root) is the distance to the nearest external node, and the nearest external node is found by consistently moving to the right. That's certainly true in our example here: to get to the nearest external node you just make a sequence of right-child moves, and the number of moves you have to make is the s value of the root.

11:16 The next property is that the number of internal nodes in a leftist tree is at least 2 raised to the s value of the root, minus 1. This follows because the nearest external node is s(root) away from the root, which means that if, say, s(root) is 5 and you make only 4 moves, all you can encounter are internal nodes. So that many levels have to be populated by internal nodes, and we're just going to count the internal nodes on those levels where we know there are no external nodes: levels 1 through s(root) cannot have external nodes, because the nearest external node is s(root) away from the root. If levels 1 through s(root) don't have external nodes, then those levels together have 2^s(root) − 1 internal nodes. In addition to this you may have internal nodes at other levels, but all we can guarantee is that the first s(root) levels have only internal nodes, so there are at least that many internal nodes in the leftist tree. For example, here the s value of the root is 2; that tells us that levels 1 and 2 have no external nodes on them, and therefore if you just count the number of nodes in the first two levels, it's got to be at least 2^2 − 1 = 3. So the number of nodes in the leftist tree is at least 2 raised to the s value of the root, minus 1. Certainly the number of nodes can be larger; in fact, in the example tree the number of nodes is much larger than 2^2 − 1.

13:37 The third property is that the length of the rightmost path is O(log n), Big O of log n, where n is the number of internal nodes in the leftist tree. To see this, take the property we just proved, property 2: the number of internal nodes is greater than or equal to 2^s − 1. From this you get that the s value is no more than log base 2 of (n + 1). And we know from property 1 that the length of the rightmost path is s; so the length of the rightmost path is at most log base 2 of (n + 1), which is O(log n).

14:44 Now, the length of the rightmost path may be much less than log n. In fact, if you look at a left-skewed binary tree, where every node has a left child and no node has a right child, the s value of the root will be one: the root is just a distance one from an external node, since if you follow a right-child pointer from the root you fall off the tree and reach an external node. But this left-skewed binary tree could have millions or billions of nodes in it. So the length of the rightmost path could be very much less than log n, but it's guaranteed not to exceed log base 2 of (n + 1). Now, the value of leftist trees comes from the fact that the operations we perform on leftist trees are based on the rightmost path; we only follow the rightmost path from the root, and so the performance of leftist trees is best when the length of the rightmost path is very small. It's guaranteed not to exceed log base 2 of (n + 1), but if you're lucky, your rightmost path could be very much shorter than that, and your performance could be very much better than log n. So the best leftist trees to have are, in fact, the left-skewed ones.
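To summarize properties 2 and 3 in symbols (my own notation; n is the number of internal nodes and s = s(root)): levels 1 through s contain only internal nodes, so

```latex
n \;\ge\; \sum_{i=1}^{s} 2^{\,i-1} \;=\; 2^{s} - 1
\qquad\Longrightarrow\qquad
s \;\le\; \log_2(n + 1),
```

and by property 1 the rightmost path has length exactly s, hence O(log n).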
16:16 Let's see how we use these as priority queues. We can have a min leftist tree, where the structure is a leftist tree and the population of the nodes follows the min tree property: each node has a value that is the smallest in its subtree. These are used for min priority queues. Correspondingly, you can have a max leftist tree, where the structure corresponds to a leftist tree and the way you populate the nodes is by the max tree discipline, so the value in a node is the largest in the subtree of which it is root; those are used for max priority queues.

17:03 So this would be an example of a min leftist tree. Structurally it's the same tree we had before, so we know it's a leftist tree, and I've populated the nodes so that the keys define a min tree: the value in each node is the smallest in the subtree of which it is root. Any questions about a min leftist tree?

17:38 Let's see how we'll use this to represent a single-ended priority queue; this version would allow us to do a remove-min operation efficiently, as well as an insert. So we have a put, that's the same as an insert; a remove min, that's the delete-min operation; a meld, which combines two min leftist trees into one; and an initialize operation. To see how these operations work, I guess we'll just go down this list. We'll see that the put and the remove min actually employ the meld; that's the key operation here. Once you know how to do a meld, you can do all the others in an effective way.

18:40 For the put, you have a min leftist tree and you want to insert a new item into it, let's say seven. What you do is create a second min leftist tree that has only this single element. So now I've got two min leftist trees and I want to meld them together; I want to meld the red one and the yellow one together. So the insert operation is: create a new min leftist tree with one item, that's easy, and then invoke the meld operation, melding the original tree with the new one. The complexity will be the complexity of the meld.

19:37 If you want to do a remove min, the min item is in the root, so we'll take the min item out of the root; basically, we'll take away the root and throw it away. Once you do that, you're left with a left subtree and a right subtree, both of which are leftist trees, and we'll meld them together into a single leftist tree. Meld the red fellow and the yellow fellow together, and I get a single min leftist tree again. The cost of this operation: it takes one unit of time to throw the root away, plus whatever time it takes to do the meld, so it's really the complexity of the meld that determines the complexity of the remove min.
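Both operations therefore reduce to a single call of meld. A sketch (mine, using the assumed Node class from before and the meld function sketched a little further below):

```python
def put(root, key):
    """Insert: meld the existing tree with a new one-node leftist tree."""
    return meld(root, Node(key))

def remove_min(root):
    """Delete min: the min is in the root; discard it and meld its subtrees.
    Returns (removed key, new root)."""
    if root is None:
        raise IndexError("remove_min from an empty priority queue")
    return root.key, meld(root.left, root.right)
```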
20:25 So both the insert and the delete operation rely on the meld, and so really what we need to do is see how you do a meld. To get the meld to run in logarithmic time, we need to be careful not to follow left-child pointers, because if you follow left-child pointers, the distance you travel could be very, very large. Starting from the root, we will follow only the rightmost path; that's the path that is guaranteed to be logarithmic in length, asymptotically. So let's say I want to meld these two leftist trees together; we'll accomplish this by starting at the roots and following only the rightmost paths.

21:29 Well, the strategy is going to be recursive. If I look at these two trees, I know that in the answer, the tree with the smaller root will supply the overall root: the three has to be the overall root of the result, because we have a min tree. So find out which of these two trees has the smaller root; in this case it's the one on the right. Then what I will do is recursively meld the right subtree of this tree with all of the other tree. Having recursively melded the right subtree here with all of the other tree, I will try to stick the result in here. Sticking the result here is valid only if the s value here is greater than or equal to the s value of the result over here; if it's not, I'll swap these two subtrees, and I'll be done. So: recursively meld the right subtree of the tree with the smaller root with all of the other tree, symbolically place it here, and do a swap if the s value here is smaller than the s value there.

22:51 That's basically how it's going to work; let's take a look at an example. I want to meld the right subtree of the smaller root, that's the six shown in yellow, with all of the other tree. So this is the problem I want to solve: meld these two trees together. Find out which tree has the smaller root; that's the four. So I will meld the right subtree of the four with all of the other tree. That is, I need to meld the eight and the six. Find out which tree has the smaller root; that's the six. So I take the right subtree of the six, which is empty, and meld it with all of the other tree. When you meld an empty tree with a non-empty tree, the result is the non-empty tree; this meld is easy because one of the trees is empty, and the result is the other tree. So melding the right subtree of the six with the eight gives you the eight. When you're finished with a meld, you take the result of the meld and make it the right subtree of the smaller root. Then you check the s value here: if it's smaller than the s value there, you do a swap. In this case the s value of the left subtree is zero and the s value of the right subtree is one, so we do a swap, and it looks like this.

24:44 Then you pop back up one level in the recursion. Here we were melding the right subtree of the four with all of the other tree, and the result is the six-eight tree, so we're going to put this in as the right subtree of the four. Then we would do a swap if the s value here were less than the s value there; but the s value here is a one and the s value there is a two, so you don't do a swap.

25:31 Then you pop up one level in the recursion. This is the topmost level, where we had to meld the right subtree of the three with all of the other tree, and we've finished with that; that's this. So this now becomes the right subtree of the three, and then we check the s values: the s value over there is a 1, and over here it's a 2, so the s value of the left is smaller than the s value of the right, and we have to do a swap.

26:21 Now, during this whole operation, for this to work, each node has to keep its s value. So you can easily compare the s values, and you can compute the new s value of a node by taking the s value of its right child and adding one.
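Putting the recursion together, here is a compact sketch of the meld for min leftist trees (again my own rendering of the steps above, using the assumed Node class and the s_value helper):

```python
def meld(a, b):
    """Meld two min leftist trees; only the rightmost paths are followed."""
    if a is None:
        return b                    # melding with an empty tree is the easy case
    if b is None:
        return a
    if b.key < a.key:               # the smaller root becomes the overall root
        a, b = b, a
    a.right = meld(a.right, b)      # recursively meld a's right subtree with b
    if s_value(a.left) < s_value(a.right):
        a.left, a.right = a.right, a.left   # restore the leftist property
    a.s = s_value(a.right) + 1      # after the swap, right child has the min s
    return a
```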
26:48 Any questions about how a meld works? [A student asks to see the last step again.] All right, from here: we're at the highest level of the recursion, and at this level we were melding the right subtree of the three with all of the other tree. We finished with that and ended up with this, so now I'm going to make this the right child of the three; that gives me this structure. Once you make somebody a right child of its parent, you have to look at the possibility of swapping the left and right subtrees. Look at the s values: the s value of this node will be one, because it's distance one from its nearest external node, and the s value here will be two, distance two from its nearest external node. Since the s of the left is smaller than the s of the right, we do a swap. Once you've done the swap, we can compute the s value of this node: it's the s value of the right child plus one, because after the swap, the right child has an s value that is less than or equal to the s value of the left child. So as you are unfolding the recursion, you're updating the s values of all the nodes that you encounter.

28:26 Any other questions about the meld? It's important to note that the swapping is done based on s values; it's not done based on the heights of these subtrees. Quite often we have problems where you're asked to do this manually, where we give you trees, and we see students swapping subtrees based on height rather than based on s value. Height and s value are two very different things; be sure that if you have to do any of these swaps, you're doing it based on s and not based on height.

29:05 All right, so we've seen how to do the insert, delete, and meld in logarithmic time, because the meld only follows the rightmost path. Let's take a look at how you initialize. The strategy to initialize in linear time is: take the n elements that have to be put into your leftist tree and create single-element leftist trees, so you create n single-node trees, and throw them into a first-in first-out queue. Then you repeatedly remove two leftist trees from the front of the queue, meld them, and put the result at the end of the queue, and you keep doing this till you're left with a single leftist tree. Each time you do this, the number of leftist trees decreases by one; once you have done this n − 1 times, you're left with a single priority queue.

30:23 You can prove this works in linear time by using a proof which is very similar to the one used to prove the linear time complexity of initializing a min heap or a max heap. Basically the argument's the same: the first n/2 of these melds that you do deal with trees whose s value is 1, then you do n/4 melds where the s value may be 2, then you do n/8 where the s value may be 3. And that's the same thing that happens when you're initializing a min heap: at the bottom level you have n/2 nodes where you're doing the adjust operation and the height is 1, then you have n/4 nodes where you're doing the adjust operation and the height is 2, and so on.
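A sketch of this initialization, using Python's collections.deque as the FIFO queue (the function name is mine):

```python
from collections import deque

def initialize(keys):
    """Build a min leftist tree out of n keys with n - 1 melds."""
    if not keys:
        return None
    q = deque(Node(k) for k in keys)   # n single-node leftist trees
    while len(q) > 1:
        a = q.popleft()                # remove two trees from the front,
        b = q.popleft()
        q.append(meld(a, b))           # meld them, put the result at the back
    return q.popleft()
```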
31:14 So basically it's the same mathematics here as in the analysis of heap initialization; you can try that out at home.

31:31 Well, let's turn our attention to another operation that you can do efficiently here, and that you can also do efficiently in a heap: an arbitrary remove. Somebody external to the structure is keeping track of where different elements are, and knows where the element that he or she wants to remove is currently sitting in your structure. So you're given a pointer to a node, and you have to remove the element that's sitting in that node. The person using your structure is keeping track of where different elements are, and comes to you and says: remove the item that's sitting in the node x.

32:29 Right, so we can remove an arbitrary element in logarithmic time; let's see how that works. The first case: x happens to be the root. Then this is just a remove-min operation, so we just take it out and do a meld. Now suppose that x is not the root. There are several ways in which one can handle this remove; we'll describe one of them. We assume that in the representation we have parent pointers, so we can go from a node to its parent. If you don't have parent pointers, it's going to be very hard to remove the item, because, if nothing else, in this case there's a right-child pointer coming in here which needs to be adjusted. So, assuming we have parent pointers, I go up and access x's parent p. Then I'm going to take the left subtree L of x and make it the right subtree of its parent; for the time being, I'll take the right subtree R and keep it out of the structure. When I make L the right subtree of its parent, in this example, what I'm left with may not be a leftist tree: it's entirely possible that the s value out here, for example, is 6 while the s value here is 30, so when you bring this subtree up, you get somebody with s value 30 here and you've got a 6 out here. So this transformation, where you throw this node away, take its left subtree, and make it the right subtree of its parent, may end up violating the leftist tree requirement. What you really need to do is follow the path beginning at this node p and going up toward the root, adjusting the s values in these nodes and swapping left and right subtrees as needed. Now, if you simply follow the whole path from here up to the root, you could be in a lot of trouble, because the length of that path could be very large. Though in this diagram it looks like these nodes are all right children of their parents, this could be a zigzag path, sometimes a right child and sometimes a left child, and because of those left-child pointers, this path could be a very, very long path. So as you're retracing your way up to the root, adjusting the s values and swapping left and right children, you need to stop as soon as you reach a node whose s value doesn't change from its old value.
35:37 For example, after you've brought this subtree up here: if the s value here was nine earlier and it remains nine, there's no point going further up, because everybody's s value further up is going to remain what it was before this operation. So you stop as soon as you reach a node whose s value doesn't change.

36:02 That's the first thing you have to do. And once you make that early stop, you then have to prove that the distance you travel from p to the first place where s doesn't change cannot exceed log n; that's something you can try to prove at home. What you can show is that as you're going back up along this path, the new s value of every node you encounter has to be one more than the s value of the previous node, and since no node has an s value more than log n, you cannot go through more than log n such nodes. So the remove runs in logarithmic time.

37:08 Well, that's not the whole thing; this only adjusted this part, and we still have to worry about the right subtree. So now we take the right subtree R and meld it with the adjusted left part using the meld operation. All right, so to repeat, the strategy is: take the left subtree of the node that's discarded and make it the appropriate child of the parent (in this case it would be the right child of the parent); travel the path from the parent toward the root, adjusting the s values and swapping left and right subtrees as needed; and stop when you reach the first node whose s value doesn't change. Then you meld that tree with R. You then have to prove that the first part runs in logarithmic time; the second part, the meld, will of course run in logarithmic time.

38:02 An alternative strategy is to take this whole subtree out; that is, don't make L the right subtree of p, but still go back up adjusting s values as before. Then you meld L and R separately, and then meld the result with the rest, so you end up doing two melds plus the adjust. The first way, you do just one meld, preceded by the adjust process. That's something where, I guess, you really need to prove the complexity. Questions about how an arbitrary remove works?

38:57 [A student asks about where x can be.] No, it may not be; it could be out here, it could be anywhere. That's why the path from x, or from p, to the root could be very, very long, and that's why you have to stop as soon as you reach the first node whose s value doesn't change. And then you have to prove that even though you stopped at the first node whose s value doesn't change, you couldn't possibly have gone through more than log n nodes, regardless of where x was. As I said, the way you prove that is by showing that as you're moving back up, at each node you encounter, the new s value has to be one more than the s value of the node you just left; and since no s value is more than log n, you can't go through more than log n such nodes.
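Here is how the first strategy might look in code. It assumes every node also carries a parent pointer, as in the representation the lecture assumes; everything else (the names, the retrace loop) is my own sketch, and for brevity it ignores the fact that the meld sketched earlier does not update parent pointers:

```python
def remove(root, x):
    """Remove the element in node x of a min leftist tree (sketch).
    Returns (removed key, new root). Assumes x.parent pointers exist."""
    if x is root:                          # case 1: x is the root
        return remove_min(root)
    p, L, R = x.parent, x.left, x.right
    # replace x in its parent p by x's left subtree L; set R aside
    if p.left is x:
        p.left = L
    else:
        p.right = L
    if L is not None:
        L.parent = p
    # retrace from p toward the root, swapping and adjusting s values,
    # stopping at the first node whose s value does not change
    node = p
    while node is not None:
        if s_value(node.left) < s_value(node.right):
            node.left, node.right = node.right, node.left
        new_s = s_value(node.right) + 1
        if new_s == node.s:
            break                          # nothing further up can change
        node.s = new_s
        node = node.parent
    return x.key, meld(root, R)            # finally meld R back in
```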
39:55 All right, other questions?

40:04 Let me mention a related data structure. This one's called a skew heap. It's very much like the leftist tree, except that we don't maintain s values in any of the nodes; you don't keep any s values. The operations work in pretty much the same way as they do in a leftist tree. The main operation is the meld, and when you do a meld it's the same thing: you take the right subtree of the smaller root and meld it with all of the other tree. Now, when we were melding in leftist trees, we had a rule that told us when to swap left and right subtrees; but here, since we have no s values, there's no such rule to tell you when to swap. So what you do is: you always swap. If you took the meld operation for a leftist tree, removed the line of code that said "if s of the left child is less than s of the right child, then do the swap," and changed it to do the swap regardless, then you'd get a skew heap. And since s values are never used, you don't maintain them for this structure, which is slightly simpler than a leftist tree, because s values are not retained and the swap is done regardless.

41:48 You can prove that the amortized complexity of the insert, delete min, and meld is logarithmic. So some operations may take a lot of time; an individual insert or meld might take order-n time. But if you do any sequence of inserts, delete mins, and melds, and that sequence has n operations, the total complexity won't be more than n log n for that whole sequence. So the skew heap has an amortized complexity of log n per operation, while the leftist tree has a worst-case complexity of log n per operation.
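So the skew heap meld is the leftist tree meld with the conditional swap made unconditional and all the s bookkeeping deleted. A sketch:

```python
def skew_meld(a, b):
    """Meld two min skew heaps: as for leftist trees, but always swap."""
    if a is None:
        return b
    if b is None:
        return a
    if b.key < a.key:                    # smaller root becomes the overall root
        a, b = b, a
    a.right = skew_meld(a.right, b)
    a.left, a.right = a.right, a.left    # swap unconditionally; no s values kept
    return a
```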
42:44 All right, any questions?

42:58 [A student asks about applications.] Well, I guess the leftist tree is something you need to compare with the heap, because that's the alternative structure that you have for single-ended priority queues. In a heap, if you want to do a meld, that operation takes linear time. If you have one heap with twenty elements and another heap with twenty elements and you want to meld them into one, then since they're sitting in two physically disparate memory locations, you have to copy from one set of memory locations into the other, because typically it's all mapped into an array. Because of that, you're going to spend linear time doing the meld. So it's really the meld operation which is the advantage of the leftist tree. Well, it's a question of the application: if you have to do melds, then you have to decide whether you want to spend linear time every time you do a meld, or whether you want to do it in logarithmic time. A skew heap, again, is a linked structure, whereas the normal heap is typically mapped into an array, a sequential structure, and a skew heap will allow you to do a meld in logarithmic amortized time, whereas the normal heap won't.

45:01 [Another question.] Well, the first thing is that a normal heap is represented, is mapped, into an array without pointers, so the space required to store a normal heap is less than for a skew heap, because you don't have the pointers. But because you don't have the pointers, if you're going to do a meld, you have to preserve the property which says that if you're at a node, the parent is obtained by dividing by two, the left child by multiplying by 2, and the right child by multiplying by 2 and adding 1. The only way to preserve that is to take all the elements in one heap and physically copy them in next to the other one. If you have elements sitting in one array in one part of memory, and another array in another part of memory, there's no way around it: you have to take the array in the other part of memory and bring it next to the first one, and that's going to take you linear time; it doesn't matter what swaps you do. So the difference comes about in the meld operation: the leftist tree and the skew heap allow you to do melds in logarithmic time, one worst-case and one amortized. And then it's a question of whether your application needs to perform a large number of melds or not; if you don't need to perform a lot of melds, then you'd probably be quite happy with an ordinary heap.

46:32 Okay, all right, other questions? [A student asks why we need skew heaps.] Compared to leftist trees? Okay. Well, since a skew heap doesn't maintain the s values, you get a slightly simpler structure. So the reasoning here would be: because it's simpler, maybe, if you measure the run time over sequences of operations, it runs a little bit faster than a leftist tree. Now, in reality, I don't think it does run faster than the leftist tree, because all those swaps take time; you're doing a bunch of unnecessary swaps. So in reality I don't believe this is going to run faster than a leftist tree over long sequences of operations, and therefore it's probably not of much use.

47:33 [A student asks what the point of doing the swaps in a skew heap is.] In a skew heap, if you don't do the swaps, then you can show that the amortized complexity is not logarithmic. So to get the amortized logarithmic complexity, you do have to do the swaps; that's the whole purpose.

48:02 All right, other questions? All right, so we'll stop here.