**Topology, Geometry and Data Seminar - Dingkang Wang**

**Date**: December 4, 2018, 5:10-6:10 pm (America/New_York)

**Location**: Cockins Hall 240

**Title**: An Improved Cost Function for Hierarchical Cluster Trees

**Speaker**: Dingkang Wang (Ohio State University, CSE)

**Abstract**: Hierarchical clustering has been a popular method in various data analysis applications. It partitions a data set into a hierarchical collection of clusters and provides a global view of the (cluster) structure behind the data across different granularity levels. A hierarchical clustering (HC) of a data set can be naturally represented by a tree, called an HC-tree, where leaves correspond to input data points and subtrees rooted at internal nodes correspond to clusters. Many hierarchical clustering algorithms used in practice are developed in a procedural manner. Recently, Dasgupta proposed to study the hierarchical clustering problem from an optimization point of view, and introduced an intuitive cost function for similarity-based hierarchical clustering with nice properties as well as natural approximation algorithms. There have since been several follow-up works on better approximation algorithms, hardness analysis, and a more general understanding of the objective function.
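To make the optimization view concrete: Dasgupta's cost of an HC-tree T on a weighted similarity graph sums, over every weighted pair (i, j), the weight w(i, j) times the number of leaves under the lowest common ancestor of i and j, so that similar pairs are rewarded for merging deep in the tree. A minimal sketch (the nested-tuple tree encoding and the toy weights are illustrative, not from the talk):

```python
# Sketch of Dasgupta's cost on a 4-point toy graph. Trees are nested
# tuples; leaves are integer point labels.

def dasgupta_cost(tree, w):
    """Sum over all weighted pairs (i, j) of w[(i, j)] times the number of
    leaves under the subtree rooted at the lowest common ancestor of i, j."""
    def walk(node):
        if isinstance(node, int):            # leaf: one point, zero cost
            return {node}, 0.0
        left, lcost = walk(node[0])
        right, rcost = walk(node[1])
        both = left | right
        # Every pair with one endpoint on each side has its LCA at this node.
        here = sum(w.get((min(i, j), max(i, j)), 0.0)
                   for i in left for j in right) * len(both)
        return both, lcost + rcost + here
    return walk(tree)[1]

# Similarities: high inside {0, 1} and {2, 3}, low across.
w = {(0, 1): 1.0, (2, 3): 1.0,
     (0, 2): 0.1, (0, 3): 0.1, (1, 2): 0.1, (1, 3): 0.1}

good = dasgupta_cost(((0, 1), (2, 3)), w)  # merges similar pairs first -> 5.6
bad = dasgupta_cost(((0, 2), (1, 3)), w)   # merges dissimilar pairs first -> 9.2
```

As expected, the tree that merges the highly similar pairs first achieves the lower cost, which is the sense in which the cost function differentiates good HC-trees from bad ones on a fixed graph.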

We observe that while Dasgupta's cost function is effective at differentiating a good HC-tree from a bad one for a fixed graph, its value does not reflect how consistent an input similarity graph is with a hierarchical structure. In this paper, we present an improved cost function, built on Dasgupta's cost function, to address this issue. The optimal tree under the new cost function remains the same as under Dasgupta's. However, the value of our cost function is more meaningful: for example, the optimal cost of a graph G equals 1 if and only if G has a "perfect HC-structure," in the sense that there exists an HC-tree consistent with all triplet relations in G; otherwise the optimal cost is larger than 1. The new formulation of the cost function also leads to a polynomial-time algorithm to compute the optimal cluster tree when the input graph has a perfect HC-structure, and an approximation algorithm when the input graph "almost" has a perfect HC-structure. Finally, we provide further understanding of the new cost function by studying its behavior for random graphs sampled from an edge probability matrix.
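One natural reading of a triplet relation (an assumption here; the abstract does not spell out the definition) is that for any three points i, j, k, the pair with the strictly largest similarity should be merged below the third point in the tree. Under that reading, a sketch of checking whether a candidate HC-tree is consistent with all triplets, and hence witnesses a perfect HC-structure (tree encoding, weights, and function names are illustrative):

```python
from itertools import combinations

def _leaves(node):
    """Set of leaf labels under a nested-tuple tree node."""
    if isinstance(node, int):
        return {node}
    return _leaves(node[0]) | _leaves(node[1])

def tree_pair(tree, i, j, k):
    """The pair among {i, j, k} merged deepest in the tree: descend while one
    child still holds all three, then the side holding two gives the pair."""
    node = tree
    trip = {i, j, k}
    while True:
        left, right = _leaves(node[0]), _leaves(node[1])
        if trip <= left:
            node = node[0]
        elif trip <= right:
            node = node[1]
        else:  # this node splits the triplet
            two = left & trip if len(left & trip) == 2 else right & trip
            return tuple(sorted(two))

def consistent_with_all_triplets(tree, w, points):
    """Check `tree` against every triplet that has a strictly most-similar
    pair; such a tree would witness a perfect HC-structure."""
    for i, j, k in combinations(points, 3):
        sims = {(i, j): w[(i, j)], (i, k): w[(i, k)], (j, k): w[(j, k)]}
        best = max(sims, key=sims.get)
        strict = sum(v == sims[best] for v in sims.values()) == 1
        if strict and tree_pair(tree, i, j, k) != best:
            return False
    return True

# Same toy weights as before: a graph with a perfect two-block hierarchy.
w = {(0, 1): 1.0, (2, 3): 1.0,
     (0, 2): 0.1, (0, 3): 0.1, (1, 2): 0.1, (1, 3): 0.1}
```

On this graph, `consistent_with_all_triplets(((0, 1), (2, 3)), w, [0, 1, 2, 3])` holds, while the tree `((0, 2), (1, 3))` violates the triplet {0, 1, 2} by merging 0 with 2 before 1.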