Title: Hypergraph-partitioning-based decomposition for parallel sparse-matrix vector multiplication
Authors: Catalyurek, U. V.; Aykanat, Cevdet
Date issued: 1999
Date available: 2016-02-08
ISSN: 1045-9219
eISSN: 1558-2183
DOI: 10.1109/71.780863
URI: http://hdl.handle.net/11693/25158
Type: Article
Language: English

Keywords: Computational hypergraph models; Hypergraph partitioning; Hypergraph-partitioning-based decomposition; Parallel sparse matrix-vector multiplication; Sparse matrices; Computational methods; Computer simulation; Graph theory; Matrix algebra; Vectors; Parallel processing systems

Abstract: In this work, we show that the standard graph-partitioning-based decomposition of sparse matrices does not reflect the actual communication volume requirement for parallel matrix-vector multiplication. We propose two computational hypergraph models which avoid this crucial deficiency of the graph model. The proposed models reduce the decomposition problem to the well-known hypergraph partitioning problem. The recently proposed successful multilevel framework is exploited to develop a multilevel hypergraph partitioning tool, PaToH, for the experimental verification of our proposed hypergraph models. Experimental results on a wide range of realistic sparse test matrices confirm the validity of the proposed hypergraph models. In the decomposition of the test matrices, the hypergraph models using PaToH and hMeTiS result in up to 63 percent less communication volume (30 to 38 percent less on average) than the graph model using MeTiS, while PaToH is only 1.3 to 2.3 times slower than MeTiS on average.
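Note: The abstract states that the proposed models reduce the decomposition problem to hypergraph partitioning. As a rough illustration only (the record above does not spell out the model's construction), the sketch below builds a column-net-style hypergraph from a sparse matrix for a rowwise decomposition of y = Ax: rows become vertices, and each column becomes a net connecting the rows with a nonzero in it. The function name, input format, and weighting are hypothetical and do not reproduce PaToH's actual interface.

    # Minimal sketch (assumed construction, not PaToH's input format):
    # build a column-net hypergraph from a sparse matrix given as
    # (row, col) nonzero coordinates, for rowwise decomposition of y = A*x.

    def column_net_hypergraph(nonzeros, num_rows, num_cols):
        """Each row i of A becomes a vertex; each column j becomes a net
        (hyperedge) connecting the vertices of rows with a nonzero in column j.
        Vertex weight = number of nonzeros in that row (local work estimate)."""
        nets = {j: set() for j in range(num_cols)}
        weights = [0] * num_rows
        for (i, j) in nonzeros:
            nets[j].add(i)
            weights[i] += 1
        # Drop empty nets: columns with no nonzeros connect no vertices.
        nets = {j: verts for j, verts in nets.items() if verts}
        return nets, weights

    # Example: a 4x4 matrix with nonzeros on the diagonal and in column 0.
    nz = [(0, 0), (1, 1), (2, 2), (3, 3), (1, 0), (3, 0)]
    nets, weights = column_net_hypergraph(nz, 4, 4)
    print(nets)     # net for column 0 -> {0, 1, 3}: rows that need x[0]
    print(weights)  # per-row nonzero counts used as vertex weights

Partitioning such a hypergraph so that each net's spread across parts is minimized is what ties the partitioning objective to the communication volume the abstract refers to; the exact metric and the second model are defined in the full paper.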