247 exam 3


general approach to MST

-start with an empty edge set T
-keep adding edges to T without creating a cycle until T spans G
-how do we know which edge to add next to ensure that w(T) is minimal? greedy principle:
-define a local criterion to apply when picking each edge
-at each step, pick the edge that is currently best by the criterion and add it to T; keep picking edges until T spans G (a Kruskal-style sketch follows below)
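One concrete greedy criterion is Kruskal's: repeatedly take the cheapest remaining edge that does not create a cycle. A minimal Java sketch, using a simple union-find to detect cycles; the Edge record and method names here are illustrative assumptions, not the course's classes:

```java
import java.util.*;

class Kruskal {
    record Edge(int u, int v, int weight) {}

    // Union-find with path halving; parent[x] == x means x is a root.
    static int find(int[] parent, int x) {
        while (parent[x] != x) { parent[x] = parent[parent[x]]; x = parent[x]; }
        return x;
    }

    // Returns the edges of a minimum spanning tree of a connected graph on n vertices.
    static List<Edge> mst(int n, List<Edge> edges) {
        List<Edge> tree = new ArrayList<>();
        int[] parent = new int[n];
        for (int i = 0; i < n; i++) parent[i] = i;
        edges.sort(Comparator.comparingInt(Edge::weight));   // greedy criterion: cheapest first
        for (Edge e : edges) {
            int ru = find(parent, e.u()), rv = find(parent, e.v());
            if (ru != rv) {                // adding e does not create a cycle
                parent[ru] = rv;
                tree.add(e);
            }
        }
        return tree;                       // spans G once it holds n - 1 edges
    }
}
```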

insertion into a 234 tree

-try to insert into an existing leaf
-if the leaf is full (already has 3 keys), split the overloaded node into 2 nodes (the median key, #2 of the 4, moves up to the parent) and make the remaining keys the two split subtrees. If the parent is also full, split the parent too. Splitting the root creates a new root, so the tree grows by creating new roots instead of new leaves.

how to test a hamiltonian path & run time

DFS, in θ(|V| + |E|)

cycles and shortest path

DNE

how to get a topological order from DFS

start at a node with no successors and then backtrack; the order in which vertices finish is a reverse topological order. To get different versions of this, start at different places and keep finishing and restarting.

shortest path from BFS

strip away the non-parent edges; what remains forms a tree of shortest paths, connecting each vertex to the starting point

cycle length

sum of the edge weights in the cycle

directed graph dfs relative finishing times

the child will have a smaller (earlier) finishing time

partial order

a vertex is "less than" another if there is a path from the smaller to the bigger (a set of inequalities that are transitive). If there is no path between two nodes, those nodes are not comparable.

directed graphs

each edge is an arrow pointing in a particular direction (from one vertex to the other)

Dijkstra's algorithm pseudocode (including the ways to figure out which vertex is next)

v.dist <- 0; all other u.dist <- ∞
all vertices are unfinished
D[v] <- PQ.insert(starting vertex v)
for every other vertex u: D[u] <- PQ.insert(u)
while (any vertex is unfinished)
    v <- unfinished vertex with smallest v.dist (extract-min)
    for each edge (v,u)
        if (v.dist + w(v,u) < u.dist)
            u.dist <- v.dist + w(v,u)   // relax: keep the smaller distance
            D[u].decrease(u)
    v is finished

ladder graph

two parallel rows of vertices where each vertex connects to the vertex directly across from it (and to its neighbors along its row), so the graph has only θ(n) edges

how to create a vertexAndDist obj w updated dist

new VertexAndDist(x.vertex, d);

longest path w dijkstra possibility

W*(n-1) (assuming every edge on the path had the maximum weight W and you have to traverse n-1 edges to get from one end to the other)

how to make the breadth first search find the distance

v.distance = 0
v.parent = null
mark v, q.enqueue(v)
while queue is not empty
    u = q.dequeue()
    for each edge (u,w) where w is a neighbor of u
        if w is not marked
            mark w
            w.distance = u.distance + 1
            w.parent = u
            q.enqueue(w)
basically the distance of a vertex is the distance of the node it was discovered from + 1 (a Java sketch follows)
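A minimal Java sketch of the same idea, assuming an adjacency-list graph given as List<List<Integer>> (a hypothetical representation, not the course's Graph class):

```java
import java.util.*;

class BfsDistance {
    // Returns dist[u] = number of edges on a shortest path from start to u,
    // or -1 if u is unreachable.
    static int[] distances(List<List<Integer>> adj, int start) {
        int[] dist = new int[adj.size()];
        Arrays.fill(dist, -1);              // -1 doubles as the "not marked" flag
        Deque<Integer> queue = new ArrayDeque<>();
        dist[start] = 0;
        queue.add(start);
        while (!queue.isEmpty()) {
            int u = queue.remove();
            for (int w : adj.get(u)) {
                if (dist[w] == -1) {        // first time we see w
                    dist[w] = dist[u] + 1;  // one edge farther than its discoverer
                    queue.add(w);
                }
            }
        }
        return dist;
    }
}
```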

ex of directed graphs

road map (one way streets), transactions, webpage references

path

sequence of directed edges leading from starting vertex to an ending vertex

cost of insertion 234 tree

-splitting a node: O(1)
-an insertion might split at every level of the path, but there are only θ(log n) levels, so insertion is θ(log n)
-deletion is also θ(log n)

weighted shortest path

-a weighted graph assigns each edge a real-valued weight w(e), where w(e) >= 0
-the length of a path is the sum of its edges' weights
-given a starting vertex v, find the path with least total weight from v to each other vertex in the graph. It may not be the path with the fewest edges.

reducing weighted path into an unweighted problem

-replace each edge of weight k with a chain of k unit-weight edges (e.g. for weight 3, add 2 intermediate vertices so the edge becomes 3 unit edges), then run BFS on the resulting graph
problem: this only works if weights are integers, and it is expensive for graphs with large weights, because BFS costs θ(|V| + |E|) and the number of added vertices is proportional to the weights

how to figure out if a graph is bipartite

-assign each vertex one of two colors and show that no edge joins two vertices of the same color
-if there is any odd-length cycle (e.g. a triangle), the graph is not bipartite
-can also decide with BFS: if we discover a vertex w via edge (u,w), label w with the opposite side from u. Claim: the graph is bipartite iff BFS never labels both endpoints of an edge (u,w) with the same side. Keep bouncing back and forth between sides; every vertex gets a label, and no edge should end up with both endpoints labeled the same.

graph traversals

-can move between vertices only by following the edges
-when we see a vertex for the 1st time, mark it to avoid repeated work
2 strategies: breadth-first search, depth-first search

tree rotation

-changes the root of a (sub)tree while maintaining the BST property
right rotation (subtree unbalanced to the left):
-if the left child is itself left-heavy: swing the left child up to the root and make the current root its right child; the right subtree of the original left child becomes the left child of the old root
-if the left child is right-heavy: take the root of the left child's right subtree and move it up to the left child's spot (making the old left child its left child), then make that node the new root of the subtree and move the old root down to be its right child; i.e. left-rotate the left child, then right-rotate the root
left rotation (subtree unbalanced to the right) is the mirror image:
-if the right child is itself right-heavy: swing the right child up to the root and make the current root its left child; anything that was the left child of the new root becomes the right child of the original root
-if the right child is left-heavy: first right-rotate the right child (moving its left child up into the right child's spot), then left-rotate the root with the new arrangement

dijkstra's alg proof

-claim: when we explore the edges out of vertex v, v has the correct shortest-path distance stored in its current best estimate v.dist
base case: the starting vertex is explored first, with the correct shortest-path distance 0
inductive step: suppose the algorithm is about to choose v for exploring. Prove by contradiction: assume v.dist > D(start, v), the actual distance (i.e. v.dist is wrong). Consider the shortest path from start to v, and let u be the last finished (already explored) vertex on that path. By the inductive hypothesis, u had the correct shortest-path distance, and D(start, u) <= D(start, v) since u precedes v on the shortest path to v and all edges have non-negative weight. Two cases:
case 1: if the edge u -> v is on the shortest path, then exploring u's outgoing edges detected that a relaxation was possible and assigned v its correct shortest-path distance D(start, v), so v.dist decreased to its true value. CONTRADICTION.
case 2: otherwise there is some other vertex x that lies between u and v on the path, with D(start, u) <= D(start, x) <= D(start, v). Since v does not have the correct shortest-path distance, v.dist > x.dist, so x would be explored next, not v. CONTRADICTION, because v is next to be explored, which requires v.dist <= x.dist; together with x.dist <= v.dist, the only way both hold is if the distances are equal, so v.dist would have to be correct, violating the assumption.

complete bipartite graph

-consists of 2 sets of vertices, L and R, such that all edges go between L and R and every possible L-R edge is present
-ex: jobs and employees, compatibility in relationships

DFS cycle detection

-G contains a cycle iff DFSVisit(v) ever finds an edge (v,u) to a vertex u that has been started but not finished.
ex: DFSVisit(u) finds a vertex w that has been started but not finished. That means DFSVisit(w) was called earlier and is not yet done, so DFS has found a path from w to u; but the edge (u,w) also exists, hence a cycle.
conversely, if G contains a cycle, let w be the first vertex of the cycle that DFS finds, and let (u,w) be the cycle edge into w. DFSVisit(w) does not return until it has found every vertex reachable from w, which includes u, so DFSVisit(u) finds the unfinished vertex w.

uses of DFS

-cycle detection -dependency resolution -reachability -compiler analyses

deletion in a 234 tree

-deleting a key from a leaf node with two or three keys: just delete it, nothing else happens
-if the leaf has only one key: do something like a rotation, rotating a key from a sibling node with multiple keys up through the parent
-if the siblings don't have extra keys: take a key down from the parent ("unsplit"/merge)
-if all of these options fail, the deletion creates a hole in the node's parent; then recursively try to find a way to fill it using options analogous to rotation and unsplitting, respectively. If neither option is possible, the hole is pushed up to the next level of the tree, and so forth until it either disappears or reaches the root. If the hole reaches the root, the tree shrinks by one level.

2-3-4 tree

-each node has 1, 2, or 3 keys
-a non-leaf node with t keys has t+1 children (2, 3, or 4)
-the natural analog of the BST property holds between a node's keys and its subtrees (a child subtree between two keys contains only values between those two keys; the leftmost child is less than the first key, the rightmost is greater than the last key)
-every path from the root to the bottom of the tree has exactly the same length
-balanced: a tree of height h has at least 2^(h+1) - 1 keys, so every 2-3-4 tree with n keys has height O(log n)
-worst case every node has 1 key and the tree is just a balanced binary tree

running time of dijkstra

-each vertex has 1 PQ insert and 1 extract-min
-each edge has 1 PQ decrease
hence the total cost is |V|·(T insert + T extract-min) + |E|·T decrease. With a binary heap each PQ operation is θ(log |V|), so the total is θ((|V| + |E|) log |V|). Slightly faster algorithms are possible with fancier heaps with O(1) decrease time.

red-black tree

-every node of a 2-3-4 tree maps to one black node with 0-2 red nodes as children
-never have 2 reds in a row; same # of black nodes on every path from root to leaf
-height O(log n)
-the most widely used efficiently-maintained ordered set
-doesn't have to rebalance a lot (use if there are lots of insertions)

DFS cost

-every vertex is discovered and marked in O(1) time, plus we check all edges that touch it; total cost is still θ(|V| + |E|)

relaxation of weighted path (dijkstra)

-explore the graph while maintaining, for each vertex v, the length of the shortest path to v seen so far; store the estimate as v.dist
-whenever we follow edge (v,u), check whether v.dist + w(v,u) < u.dist; if so, update u.dist with the new estimate and continue
-Dijkstra says to explore the edges out of the unfinished vertex v with smallest v.dist and relax its adjacent vertices; stop when each vertex has had its outgoing edges checked once
basically: start at vertex A, which has distance 0. Look at its neighbors; A.dist + weight is less than infinity, so adjust their distances to A.dist + weight. A is then finished because all its edges have been checked. Move to the unfinished neighbor with the smallest distance and keep going until all the vertices finish; each vertex then holds its shortest distance to the start. The order of exploration is driven by the current best distances.

DFS

-finds all vertices reachable from a given v before completing v
-instead of just marking, assign v.start (when v is first discovered) and v.finish (after we complete v)
set time = 0
DFSVisit(v)
    v.start = time++
    for each edge (v,u)
        if u.start is not yet set
            DFSVisit(u)
    v.finish = time++
this uses a stack (the call stack). Example: start A at time 0, so time becomes 1; start B at time 1 and explore its neighbors; go to D and start it at time 2, then explore its neighbors; start E at time 3; start F at time 4. F has no children, so finish F at time 5 and backtrack to E. Discover C and start it at time 6; C has no unvisited neighbors, so finish it at time 7 and backtrack to E; no more neighbors, so finish E at time 8. Keep backtracking and finishing the others. If the starting vertex finishes and there are still unvisited vertices, start calling DFSVisit on an unlabeled vertex. The result is not unique: there are different ways of doing this. Terminates when the stack is empty. (A Java sketch follows.)
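A minimal Java sketch of DFS with start/finish times, again assuming a List<List<Integer>> adjacency list (a hypothetical representation):

```java
import java.util.*;

class DfsTimes {
    int time;
    int[] start, finish;          // 0 means "not yet set"

    void dfsAll(List<List<Integer>> adj) {
        int n = adj.size();
        start = new int[n];
        finish = new int[n];
        time = 1;                 // start times from 1 so 0 can mean "unvisited"
        for (int v = 0; v < n; v++)
            if (start[v] == 0)    // restart from an unlabeled vertex if needed
                visit(adj, v);
    }

    void visit(List<List<Integer>> adj, int v) {
        start[v] = time++;        // v is discovered
        for (int u : adj.get(v))
            if (start[u] == 0)
                visit(adj, u);
        finish[v] = time++;       // v is completed after all its descendants
    }
}
```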

breadth first search (bfs)

-first come, first searched
-a FIFO queue tracks the vertices to be searched; initially it contains only the starting vertex v
while queue is not empty
    u = q.dequeue()
    for each edge (u,w) where w is a neighbor of u
        if w is not marked
            mark w
            q.enqueue(w)
basically you start at the first vertex, mark it, and put it in the queue. Then you pull a vertex off the queue and look at its edges: mark its unmarked neighbors and put them in the queue. Then pull the next vertex off the queue and repeat until you have pulled everything off the queue. The search is not unique (there are multiple valid orders). Vertices in the queue are called the frontier (discovered but not yet visited).

rebalance method

-if the balance factor = -2 and root.left's balance = -1: call rightRotate on root
-if the balance factor = -2 and root.left's balance = 1: set root.left = leftRotate(root.left), then rightRotate root
-if the balance factor = 2 and root.right's balance = 1: leftRotate root
-if the balance factor = 2 and root.right's balance = -1: set root.right = rightRotate(root.right), then leftRotate root
(a sketch follows)
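A minimal self-contained Java sketch of that logic; the Node fields and helper names are assumptions, not the exact course API:

```java
class AvlRebalance {
    static class Node { int key, height; Node left, right; }

    static int height(Node n) { return (n == null) ? -1 : n.height; }
    static int getBalance(Node n) { return height(n.right) - height(n.left); }
    static void updateHeight(Node n) {
        n.height = 1 + Math.max(height(n.left), height(n.right));
    }

    // Returns the (possibly new) root of the rebalanced subtree.
    static Node rebalance(Node root) {
        if (getBalance(root) == -2) {              // left-heavy
            if (getBalance(root.left) == 1)        // left-right case: rotate the child first
                root.left = leftRotate(root.left);
            root = rightRotate(root);
        } else if (getBalance(root) == 2) {        // right-heavy
            if (getBalance(root.right) == -1)      // right-left case: rotate the child first
                root.right = rightRotate(root.right);
            root = leftRotate(root);
        }
        return root;
    }

    static Node rightRotate(Node origRoot) {
        Node newRoot = origRoot.left;
        origRoot.left = newRoot.right;
        newRoot.right = origRoot;
        updateHeight(origRoot);                    // old root first: its height feeds the new root's
        updateHeight(newRoot);
        return newRoot;
    }

    static Node leftRotate(Node origRoot) {
        Node newRoot = origRoot.right;
        origRoot.right = newRoot.left;
        newRoot.left = origRoot;
        updateHeight(origRoot);
        updateHeight(newRoot);
        return newRoot;
    }
}
```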

getbalance method

-if both subtrees are null, balance = 0
-if it has both subtrees, balance = right height - left height
-if only a left subtree, balance = -(left height + 1) (the missing right subtree counts as height -1)
-if only a right subtree, balance = right height + 1
(see the snippet below)
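A tiny Java snippet of those cases, assuming a Node with a stored height field (names are assumptions):

```java
class AvlBalance {
    static class Node { int key, height; Node left, right; }

    // A missing subtree counts as height -1, so:
    //   both null          -> 0
    //   both present       -> right.height - left.height
    //   only left present  -> -(left.height + 1)
    //   only right present -> right.height + 1
    static int getBalance(Node n) {
        int leftH  = (n.left  == null) ? -1 : n.left.height;
        int rightH = (n.right == null) ? -1 : n.right.height;
        return rightH - leftH;
    }
}
```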

why do edges have to be non-negative for dijkstra

-Dijkstra finishes a vertex once it is extracted with the smallest current distance and never updates it again. But if some edge along a path that currently looks longer had a negative weight, that path could actually be shorter; by then the vertex is already finished, so it won't be updated with the correct distance.

cost of AVL tree

-insert/delete: O(h)
-rotation: O(1) per level
-total cost O(h), and h is θ(log n), so the added rebalancing cost is only O(log n) and all operations are θ(log n)

when do we rebalance an AVL tree

-inserting/removing a node x may unbalance some subtree rooted at an ancestor of x
-to find it, try to rebalance the subtrees rooted at ancestors of x, moving up the tree starting with x's parent
-insertions can stop after the first rebalance; deletions keep checking up to the root. Rebalance when Math.abs(getBalance()) > 1.

applications to the abstract graph problem

-network design problems
-clustering data points by proximity (remove the k-1 largest MST edges to form k clusters)
-approximate answers to much harder problems (traveling salesperson)

strategies for shortest weighted path

-reduce to an unweighted problem, or use relaxation with Dijkstra's algorithm

why find the shortest weighted path?

-road map: find the shortest route
-routing network: each hop has a cost, find the cheapest way to send data
-state-space search in AI with action costs (A* search)
the costs and weights could be different things (time, fuel, distance, etc.)

BFS applications

-shortest distance -bipartite detection -bipartite matching -state space search in AI -facebook uses for friendship. recommends people at distance 2 (people at distance 1 are friends)

alternatives to dijkstra for negative weights

Bellman-Ford: θ(|V||E|), a much larger running time. For shortest-path distances between every pair of vertices: Floyd-Warshall, θ(|V|^3).

dijkstra on dense graph

In general, Dijkstra's algorithm performs 1 insert into the priority queue and 1 extract-min per vertex, and each edge causes at most 1 priority-queue decrease. Therefore the total cost in general is the number of vertices times (the time to insert + the time to extract-min) plus the number of edges times the time to decrease, which is θ((|V| + |E|) log |V|) with a binary heap. In a dense graph, the graph has close to the maximum number of edges; the maximum is n(n-1)/2 where n is the number of vertices, so the asymptotic number of edges in a dense graph is θ(n^2). Plugging into the general cost, the running time is θ((|V| + |V|^2) log |V|), or just θ(|V|^2 log |V|).

dijkstra on sparse graph

In general, Dijkstra's algorithm performs 1 insert into the priority queue and 1 extract-min per vertex, and each edge causes at most 1 priority-queue decrease. Therefore the total cost in general is the number of vertices times (the time to insert + the time to extract-min) plus the number of edges times the time to decrease, which is θ((|V| + |E|) log |V|) because with a min-heap these operations all take log |V| time. In a sparse graph, the graph has close to the minimum number of edges, roughly one edge per vertex, so the |E| terms in the general cost become |V| and the running time is θ((|V| + |V|) log |V|), or just θ(|V| log |V|).

fibonacci heap dijkstra dense

In general, Dijkstra's algorithm performs 1 insert into the priority queue and 1 extract-min per vertex, and each edge causes at most 1 priority-queue decrease, so the total cost is the number of vertices times (the time to insert + the time to extract-min) plus the number of edges times the time to decrease. With a Fibonacci heap, extract-min takes log n (amortized) time while insert and decrease take constant (amortized) time, so the run time is θ(|V| + |V| log |V| + |E|). In a dense graph the number of edges is close to the maximum n(n-1)/2, i.e. θ(|V|^2), so plugging in, the running time is θ(|V| + |V| log |V| + |V|^2), or just θ(|V|^2).

fibonacci heap dijkstra sparse

In general, Dijkstra's algorithm performs 1 insert into the priority queue and 1 extract-min per vertex, and each edge causes at most 1 priority-queue decrease, so the total cost is the number of vertices times (the time to insert + the time to extract-min) plus the number of edges times the time to decrease. With a Fibonacci heap, extract-min takes log n (amortized) time while insert and decrease take constant (amortized) time, so the run time is θ(|V| + |V| log |V| + |E|). In a sparse graph the number of edges is close to the minimum, roughly one per vertex, so |E| is θ(|V|) and the running time is θ(|V| + |V| log |V| + |V|), or just θ(|V| log |V|).

sparse graphs

θ(n) edges; examples: ladder, tree

costs of adjacency list

space: θ(|V| + |E|); since |E| can be as large as θ(|V|^2), this is still θ(|V|^2) for dense graphs
time to see if an edge (u,v) exists: O(|E|) in the worst case (scan u's list)
time to enumerate all the edges: θ(|V| + |E|), which again can be θ(|V|^2) when the graph is dense

recursive helper functions

The helper methods are used because an additional parameter is needed to perform the recursion: there needs to be a parameter for the current (sub)root so the recursion can descend and the root reference can be updated. The helper method carries that extra parameter; the public method just calls it with the tree's root.

why does the child pointer need to be updated in insert

The pointer must be updated because when the insert method is called, the subtree might become unbalanced. The subtree is then rebalanced, and the node at the top of that subtree might change, so the parent's child pointer must be reassigned to the (possibly new) subtree root after rebalancing. It also has to be updated simply because a new child may have been attached to that node. (See the sketch below.)
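A minimal sketch of both ideas (the extra root parameter and reassigning the child pointer), assuming a simple AVL-style Node; the rebalance call is left as a stub here and the names are assumptions, not the course's exact classes:

```java
class AvlInsert {
    static class Node {
        int key, height;
        Node left, right;
        Node(int key) { this.key = key; }
    }

    Node root;

    // Public method: no extra parameter, just delegates to the helper.
    public void insert(int key) {
        root = insertHelper(root, key);          // root itself may change
    }

    // Helper carries the current subtree root and returns the (possibly new)
    // root of that subtree after insertion and rebalancing.
    private Node insertHelper(Node node, int key) {
        if (node == null) return new Node(key);  // empty spot: new leaf
        if (key < node.key)
            node.left = insertHelper(node.left, key);    // child pointer reassigned
        else if (key > node.key)
            node.right = insertHelper(node.right, key);
        updateHeight(node);
        return rebalance(node);                  // may hand back a different node
    }

    private int height(Node n) { return (n == null) ? -1 : n.height; }
    private void updateHeight(Node n) {
        n.height = 1 + Math.max(height(n.left), height(n.right));
    }
    private Node rebalance(Node n) {
        // rotations as described in the rebalance card; omitted in this sketch
        return n;
    }
}
```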

cost of adjacency matrix

space: θ(|V|^2)
time to see if an edge (u,v) exists: O(1), because it's a direct lookup in a row
time to enumerate all the edges: θ(|V|^2)

how to do vertex/edge w arrays

To do this, a 2D array would be used. I would make the vertex's private id field accessible so it could be used in this array; the id serves as the index for the vertex. Then, because an edge is attached to two vertices, the edge would be located at the spot in the 2D array where vertexTo's id is the row index and vertexFrom's id is the column index. In this spot, the edge's weight would be stored. (See the sketch below.)
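A minimal sketch of that representation; the class and method names here are assumptions, not the course's Vertex/Edge classes:

```java
class WeightMatrix {
    // weights[to][from] holds the weight of the edge from "from" to "to";
    // a sentinel value (here Integer.MIN_VALUE) means "no edge".
    private final int[][] weights;
    private static final int NO_EDGE = Integer.MIN_VALUE;

    WeightMatrix(int numVertices) {
        weights = new int[numVertices][numVertices];
        for (int[] row : weights) java.util.Arrays.fill(row, NO_EDGE);
    }

    void addEdge(int fromId, int toId, int weight) {
        weights[toId][fromId] = weight;   // row = to, column = from, as in the card
    }

    boolean hasEdge(int fromId, int toId) {
        return weights[toId][fromId] != NO_EDGE;
    }

    int weight(int fromId, int toId) {
        return weights[toId][fromId];
    }
}
```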

how to code the return path for dijkstra's

// get the edge the end vertex was discovered from
Edge edge = this.parentEdges.get(endVertex);
LinkedList<Edge> path = new LinkedList<>();   // stores the edges of the path
while (edge != null) {
    // since we are working backwards, add each edge at the front of the list
    path.addFirst(edge);
    // to iterate backwards, find the parent vertex, get its parent edge, and use that edge
    Vertex newVertex = edge.from;
    edge = this.parentEdges.get(newVertex);
}
return path;

cycle

a path that starts and ends at the same vertex

hamiltonian path

a path that touches every vertex

tree graph

all connected but no cycles

topological order

any ordering of the vertices consistent with the partial order defined by the edges. There may be more than one if some vertices are not comparable; vertices that are not comparable are "parallel". Ex: getting dressed, or pre-reqs for a class: you can really do things in any order as long as the pre-reqs for something are satisfied first. For DFS, if you list the vertices in reverse order of finishing time, it will give you a topological ordering. If the graph has a cycle there cannot be a topological order.

adjacency list

an array of linked lists; each list holds the vertices that the index vertex has an edge to. If the lists are symmetric (u appears in v's list iff v appears in u's), it is an undirected graph.

DFS on a directed graph: prove that the node an edge goes to has an earlier finishing time

because you finish vertices as you backtrack: if a node has no unvisited children you close it up, then backtrack and close the others, so the child node closes up first and has the earlier finishing time

undirected graph

bidirectional, same as two directed arrows going both ways

when do we call update height

call update height on the root every time something is inserted or deleted. then call it on both the old root and new root after a right/left rotate

shortest path proof

claim: BFS enqueues every vertex w with D(v,w) = d before any vertex x with D(v,x) > d.
corollary: BFS assigns every vertex its correct shortest-path distance from v.
NB: if the graph is not connected, some vertices may be unreachable from v (their distance is infinity).
base case d = 0: v itself is enqueued first and has D(v,v) = 0.
inductive step:
-consider a vertex w with D(v,w) = d > 0
-there is some u such that D(v,u) = d-1 and the edge (u,w) exists
-by the inductive hypothesis, u is enqueued before any vertex with distance >= d; hence by the FIFO property of the queue, u is dequeued before any vertex with distance >= d
-when u is dequeued, w is enqueued if not yet seen. Any vertex with distance > d must be discovered and enqueued via an edge from a vertex at distance >= d, which is dequeued after u. Therefore no vertex at distance > d is enqueued prior to w.
(this proves shortest paths because the queue stays partitioned by distance: we don't care about the order within a distance, only across distances; since vertices are added in distance order, the assigned distance has to be the shortest, because the backwards case can't happen)

edge

connects 2 vertices in a graph

modified version of dijkstra's that solves the Shortest-Paths with Bounded Edge Weights problem in nW+m time

create an array of buckets indexed by distance, from 0 up to (n-1)·W (one bucket for each possible distance value), and put the start vertex in bucket 0. Iterate through the buckets from index 0 upward: pop each vertex out of the current bucket, relax its outgoing edges, and put each improved neighbor into the bucket for its new tentative distance. Because the buckets are indexed least to greatest, vertices are finished from smallest to largest distance, just as in Dijkstra, in O(nW + m) time. (A sketch follows.)
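A sketch of that bucket-based approach (essentially Dial's algorithm), assuming an adjacency list of int[]{neighbor, weight} pairs; the representation and names are hypothetical:

```java
import java.util.*;

class BoundedWeightShortestPaths {
    // Dijkstra with distance buckets. adj.get(u) holds int[]{v, w} pairs, 0 <= w <= maxWeight.
    // Runs in O(n*maxWeight + m): no shortest-path distance exceeds (n-1)*maxWeight.
    static int[] shortestPaths(List<List<int[]>> adj, int source, int maxWeight) {
        int n = adj.size();
        int maxDist = (n - 1) * maxWeight;
        int[] dist = new int[n];
        Arrays.fill(dist, Integer.MAX_VALUE);
        dist[source] = 0;

        // One bucket per possible distance value, 0 .. (n-1)*maxWeight.
        List<List<Integer>> buckets = new ArrayList<>();
        for (int d = 0; d <= maxDist; d++) buckets.add(new ArrayList<>());
        buckets.get(0).add(source);

        boolean[] finished = new boolean[n];
        for (int d = 0; d <= maxDist; d++) {               // smallest distance first
            for (int i = 0; i < buckets.get(d).size(); i++) {
                int u = buckets.get(d).get(i);
                if (finished[u] || dist[u] != d) continue; // skip stale bucket entries
                finished[u] = true;
                for (int[] edge : adj.get(u)) {
                    int v = edge[0], w = edge[1];
                    if (dist[u] + w < dist[v]) {           // relax
                        dist[v] = dist[u] + w;
                        buckets.get(dist[v]).add(v);       // drop v into its new bucket
                    }
                }
            }
        }
        return dist;   // unreachable vertices keep Integer.MAX_VALUE
    }
}
```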

negative edge weight cycle

cycle length is less than 0

graph

describes pairwise relationships among objects: a set V of nodes together with a set E of edges; each pair of nodes u and v may be connected by an edge (u,v) or not

if W were no longer constrained to be a constant, modified vs. dijkstra's run time

Dijkstra is θ((|V| + |E|) log |V|), i.e. (n + m) log n, while the modified algorithm is θ(nW + m). If W is not constant, e.g. W on the order of n^2, the modified algorithm becomes n·n^2 + m, i.e. θ(n^3) time, which is much worse. But with a constant max weight the modified algorithm is θ(n + m) time, which is better than Dijkstra's.

asymmetric vs. symmetric graphs

directed graphs are asymmetric, undirected graphs are symmetric

simple graph

does not allow self edges

acyclic

does not have cycle

how to represent graphs in the computer

either adjacency lists or adjacency matrices

cost of BFS

every vertex reachable from the start is processed; mark, enqueue, and dequeue are all O(1). Enumerating the edges is θ(|E|) summed over all vertices (assuming adjacency lists). Total cost θ(|V| + |E|).

distance

for any vertices v and u, the distance D(v,u) = the smallest # of edges on any path from v to u; the distance from a vertex to itself is 0

how to enumerate edges of a vertex

for(Edge e : x.edgesFrom())

adjacency matrix

one row/column per vertex; the entry is 1 if there is an edge between those 2 vertices. In a simple graph there are 0s on the main diagonal because there are no self edges. The matrix is symmetric across the diagonal for an undirected graph, but not necessarily for a directed one.

first after method

given a value v, returns the least element of the set that is ≥ v; if no such element exists, the method returns a special value "notFound". v itself may or may not be in the set.
T firstAfter(T v)
    return firstAfterHelper(v, root)
T firstAfterHelper(value, root)
    if root = null
        return notFound
    if root.key = value
        return root.key
    if root.key < value
        return firstAfterHelper(value, root.right)
    // root.key > value: root is a candidate, but a smaller candidate may exist in the left subtree
    candidate = firstAfterHelper(value, root.left)
    if candidate = notFound
        return root.key
    return candidate
basically you need a helper method for this so you can pass the current root. If the root is null: notFound. If the root equals the value: that's the answer. If the root is less than the value, anything ≥ v can only be in the right subtree, so make the recursive call on the right child. If the root is greater than the value, the root itself is a candidate, but there might be a smaller element that is still ≥ v in the left subtree; recurse on the left child, and if that returns notFound the root is the least such element. (A Java version follows.)
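A minimal runnable Java version of the same logic, assuming integer keys and using null as the "notFound" value (a simplification of the generic version above):

```java
class FirstAfter {
    static class Node {
        int key;
        Node left, right;
    }

    // Returns the least key in the BST that is >= v, or null if none exists.
    static Integer firstAfter(Node root, int v) {
        if (root == null) return null;                        // empty subtree: notFound
        if (root.key == v) return v;                          // exact match is the answer
        if (root.key < v) return firstAfter(root.right, v);   // answer must be to the right
        // root.key > v: root is a candidate, but a smaller one may sit in the left subtree
        Integer candidate = firstAfter(root.left, v);
        return (candidate == null) ? root.key : candidate;
    }
}
```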

cyclic

has cycle. self edges are cyclic

balance factor

height of the right subtree minus height of the left subtree. Can be computed in θ(1) if heights are stored. Can only be 0, 1, or -1 in an AVL tree. To maintain the AVL property, go down the tree recursively and maintain the heights of the nodes you encounter. An insert/remove can change a node's balance factor by at most 1 (making it +2 or -2 if you add to the big side or remove from the smaller side).

order of vertices DFS directed graph

if a directed graph does not have a cycle, we can assign a partial order to the vertices: for u ≠ v, u < v if there exists a path in G from u to v.

prove the topological order is unique if there is a directed path of |V|-1 edges that touches all |V| vertices

if a path with |V|-1 edges touching all |V| vertices exists, that path is the order, because it connects (and therefore orders) all of the vertices: they are constrained to appear in exactly that order, so the topological order is unique. If such a path does not exist, then some adjacent pair in a topological ordering has no edge directly between them, so the two could be swapped and the ordering would not be unique. If |V|-1 edges touch every single vertex in a chain, the vertices have to be constrained to that order.

if this was a valid vertex order [v1,...,vi−1,vi+1,vi,vi+2,...,vn], prove that the edge (vi, vi+1) does not exist

if the swapped ordering is valid, then the edge (vi, vi+1) cannot exist: if it did, vi would have to precede vi+1 in every valid ordering, so the two could not be switched. Since there is no vertex between them in the order, the only thing that could force their relative order is a direct edge, and the valid swapped ordering shows no such constraint exists.

DFS last node topological order

if the last node has no successors then it has the smallest finishing time and should be last in the order

update height method

if there are both subtrees, then height = max of the subtree heights + 1. If there is only 1 subtree, height = that subtree's height + 1. If there is none, height = 0 because a leaf has height 0 (the height of an empty tree is -1). This is where the height field is maintained. (See the snippet below.)
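A tiny Java snippet of that rule, assuming a Node with a stored height field (names are assumptions):

```java
class AvlHeight {
    static class Node { int key, height; Node left, right; }

    // An empty subtree counts as height -1, so a single leaf ends up with height 0.
    static int height(Node n) { return (n == null) ? -1 : n.height; }

    // Recompute a node's height from its children's stored heights.
    static void updateHeight(Node n) {
        n.height = 1 + Math.max(height(n.left), height(n.right));
    }
}
```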

adjacent directed graph

if there is an edge from x to y. only adj if going in that direction.

how do we track next vertex to explore?

maintain a collection of unfinished vertices
-each step, efficiently find the unfinished vertex v with the smallest v.dist and remove it
-distances may change due to relaxation, but changes go in only one direction (they only decrease)
-USE A PRIORITY QUEUE so vertices can jump the line
-maintain a priority queue of unfinished vertices, keyed on distance
-every vertex is inserted into the priority queue with its starting distance
-each step, find the next vertex by extract-min
-decrease v.dist using a Decreaser object
-assume a map D[] from vertices to Decreasers (handles for every vertex)
-it's implemented as a heap, but the vertex to decrease is not necessarily at the top, so you need the Decreaser handle to find it

successor of a non leaf node 234

must be in a leaf node

max # of edges for directed graph

n(n-1), which is O(n^2)

max # of edges for undirected graph

n(n-1)/2, which is O(n^2)

right rotate method

newRoot = origRoot.left
rightChild = newRoot.right
newRoot.right = origRoot
origRoot.left = rightChild
then update both heights (the old root first, then the new root)

left rotate method

newRoot = origRoot.right
leftChild = newRoot.left
newRoot.left = origRoot
origRoot.right = leftChild
then update both heights (the old root first, then the new root). (A Java sketch of both rotations follows.)
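A minimal Java version of both rotations, assuming the same Node-with-height setup as the other AVL cards (names are assumptions):

```java
class AvlRotations {
    static class Node { int key, height; Node left, right; }

    static int height(Node n) { return (n == null) ? -1 : n.height; }
    static void updateHeight(Node n) {
        n.height = 1 + Math.max(height(n.left), height(n.right));
    }

    // Right rotation: the left child becomes the new subtree root.
    static Node rightRotate(Node origRoot) {
        Node newRoot = origRoot.left;
        Node rightChild = newRoot.right;
        newRoot.right = origRoot;
        origRoot.left = rightChild;
        updateHeight(origRoot);   // old root first: its height feeds the new root's
        updateHeight(newRoot);
        return newRoot;
    }

    // Left rotation: mirror image, the right child becomes the new subtree root.
    static Node leftRotate(Node origRoot) {
        Node newRoot = origRoot.right;
        Node leftChild = newRoot.left;
        newRoot.left = origRoot;
        origRoot.right = leftChild;
        updateHeight(origRoot);
        updateHeight(newRoot);
        return newRoot;
    }
}
```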

in DFS can the child be started but not finished

no, because there are no cycles: a child that was started but not finished would mean a back edge, i.e. a cycle, so this should be the first time you are visiting that node (or it is already finished)

ex of undirected graphs

railroads connecting cities (because can go both ways), pairings for a tennis match

how to code dijkstra's method

while the pq is not empty:
    VertexAndDist currentVertex = pq.extractMin();
    for (Edge edge : currentVertex.vertex.edgesFrom()) {
        // find the vertex connected to the other end of the edge
        Vertex vertexOtherEnd = edge.to;
        // find the Decreaser object for that vertex, which you get from the handles hash map
        Decreaser<VertexAndDist> handleVertexOtherEnd = this.handles.get(vertexOtherEnd);
        // the Decreaser's value is a VertexAndDist object
        VertexAndDist vertexOtherEndAndDist = handleVertexOtherEnd.getValue();
        // if the current vertex's distance plus the edge weight is less than the stored distance
        if (currentVertex.distance + weights.get(edge) < vertexOtherEndAndDist.distance) {
            // relax: update the distance to the current distance + weight
            int distance = currentVertex.distance + weights.get(edge);
            handleVertexOtherEnd.decrease(new VertexAndDist(vertexOtherEndAndDist.vertex, distance));
            // then store the parent edge
            this.parentEdges.put(vertexOtherEnd, edge);
        }
    }

to solve negative weights problem

you found the minimum (most negative) edge weight in the graph, min, and added |min| to each edge in the graph, thus making every edge in the graph non-negative.

problem duJour network design

you have a collection of cities on a map; connect them all on a power grid, and every city must be connected.
abstract graph problem:
-cities = vertices
-transmission lines = edges
-pick a subset of edges that spans the graph (connects all vertices)
-adding construction costs: the edge between vertices u,v has cost w(u,v) >= 0. We want the minimum total cost to connect the vertices: pick a set T of edges that spans the graph such that w(T) = ∑e∈T w(e) is minimized. T is an MST (minimum spanning tree) of G.

dense graphs

θ(n^2) edges; examples: complete graph, complete bipartite graph

