Learning Guarantee of SDP Relaxation on Directed Stochastic Block Models

Yanlang Chen

Abstract


Community detection in networks has been a focal point in various scientific domains, but the study of directed networks remains relatively under-explored despite their prevalence and importance in capturing real-world systems. This work addresses that gap by focusing on Directed Stochastic Block Models (DSBMs), a natural extension of traditional Stochastic Block Models (SBMs) to directed graphs. The inherent complexity of directionality in DSBMs makes them challenging to analyze and calls for new mathematical frameworks and computational approaches. We introduce an augmented matrix that encapsulates the directional relationships within these networks and serves as the basis for our analysis. We establish the information-theoretic threshold for exact recovery in DSBMs and propose an SDP relaxation that achieves this threshold, thereby contributing to the theoretical understanding of community detection in directed graphs.
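To make the abstract's approach concrete, the sketch below shows a minimal SDP relaxation for two balanced communities in a directed SBM. It is illustrative only: the paper's exact augmented matrix construction is not specified in this abstract, so the code uses the simple symmetrization A + Aᵀ as a stand-in, and the sampler, parameters p and q, and function names are hypothetical choices made for the example.

```python
import numpy as np
import cvxpy as cp

def generate_dsbm(n, p, q, rng):
    """Sample a two-community directed SBM on n nodes: edge i->j appears
    with probability p within a community and q across communities
    (illustrative parameters, not the paper's)."""
    labels = np.repeat([1, -1], n // 2)
    same = np.equal.outer(labels, labels)
    A = (rng.random((n, n)) < np.where(same, p, q)).astype(float)
    np.fill_diagonal(A, 0.0)  # no self-loops
    return A, labels

def sdp_recover(A):
    """Solve a standard exact-recovery SDP on a symmetrized observation:
    maximize <M, X> subject to X PSD, diag(X) = 1, <J, X> = 0,
    where M = A + A^T stands in for the paper's augmented matrix."""
    n = A.shape[0]
    M = A + A.T
    X = cp.Variable((n, n), symmetric=True)
    constraints = [X >> 0, cp.diag(X) == 1, cp.sum(X) == 0]
    cp.Problem(cp.Maximize(cp.trace(M @ X)), constraints).solve(solver=cp.SCS)
    # Round the SDP solution: signs of the leading eigenvector give labels.
    _, vecs = np.linalg.eigh(X.value)
    return np.sign(vecs[:, -1])

rng = np.random.default_rng(0)
A, truth = generate_dsbm(n=100, p=0.40, q=0.05, rng=rng)
est = sdp_recover(A)
agreement = max(np.mean(est == truth), np.mean(-est == truth))
print(f"agreement with planted labels: {agreement:.2%}")
```

In this sketch the balanced-community constraint <J, X> = 0 plays the role usually served by knowing the two groups have equal size; the paper's actual relaxation and its use of directional information may differ.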



DOI: https://doi.org/10.22158/asir.v8n1p63


Copyright (c) 2024 Yanlang Chen

This work is licensed under a Creative Commons Attribution 4.0 International License.
