Recursive bipartitioning models for performance improvement in sparse matrix computations

buir.advisorAykanat, Cevdet
dc.contributor.authorAcer, Seher
dc.date.accessioned2017-09-08T11:01:36Z
dc.date.available2017-09-08T11:01:36Z
dc.date.copyright2017-08
dc.date.issued2017-08
dc.date.submitted2017-09-07
dc.descriptionCataloged from PDF version of articleen_US
dc.descriptionThesis (Ph.D.): İhsan Doğramacı Bilkent University, Department of Computer Engineering, 2017.en_US
dc.descriptionIncludes bibliographical references (leaves 144-151).en_US
dc.description.abstractSparse matrix computations are among the most important building blocks of linear algebra and arise in many scientific and engineering problems. Depending on the problem type, these computations may be in the form of sparse matrix-dense matrix multiplication (SpMM), sparse matrix-vector multiplication (SpMV), or factorization of a sparse symmetric matrix. For both SpMM and SpMV performed on distributed-memory architectures, the associated data and task partitions among processors affect the parallel performance to a great extent, especially for sparse matrices with an irregular sparsity pattern. Parallel SpMM is characterized by high volumes of data communicated among processors, whereas both the volume and the number of messages are important for parallel SpMV. For the factorization performed in envelope methods, the envelope size (i.e., profile) is an important factor that determines the performance. To improve the performance of each of these sparse matrix computations, we propose graph/hypergraph partitioning models that exploit the advantages provided by the recursive bipartitioning (RB) paradigm in order to meet the specific needs of the respective computation. In the models proposed for SpMM and SpMV, we utilize the RB process to target multiple volume-based communication cost metrics and a combination of volume- and number-based communication cost metrics in their partitioning objectives, respectively. In the model proposed for the factorization in envelope methods, the input matrix is reordered by utilizing the RB process, in which two new quality metrics relating to profile minimization are defined and maintained. The experimental results show that the proposed RB-based approaches outperform the state of the art for each mentioned computation.en_US
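
A minimal sketch of the recursive bipartitioning (RB) paradigm the abstract refers to, not the thesis implementation: a K-way partition is obtained by recursively splitting a vertex set in two until K parts remain. The bipartition step below is a placeholder (an even split); an actual graph/hypergraph bipartitioner would minimize a cut or volume metric at every split, which is the point where the thesis attaches its communication- and profile-related objectives.

def bipartition(vertices):
    # Placeholder bipartition: split the vertex list into two halves.
    # In practice this step is performed by a graph/hypergraph
    # bipartitioning tool that minimizes a cut/volume objective.
    half = len(vertices) // 2
    return vertices[:half], vertices[half:]

def recursive_bipartition(vertices, k):
    # Base case: a single part is requested, return it as-is.
    if k == 1:
        return [vertices]
    # Distribute the requested number of parts as evenly as possible
    # between the two sides, then recurse on each side independently.
    k_left = k // 2
    k_right = k - k_left
    left, right = bipartition(vertices)
    return (recursive_bipartition(left, k_left) +
            recursive_bipartition(right, k_right))

if __name__ == "__main__":
    # Example: partition 10 vertices into 4 parts.
    print(recursive_bipartition(list(range(10)), 4))
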
dc.description.provenanceSubmitted by Betül Özen (ozen@bilkent.edu.tr) on 2017-09-08T11:01:36Z No. of bitstreams: 1 10163144.pdf: 2154022 bytes, checksum: c440695fc320371d282c791d569ba3c8 (MD5)en
dc.description.provenanceMade available in DSpace on 2017-09-08T11:01:36Z (GMT). No. of bitstreams: 1 10163144.pdf: 2154022 bytes, checksum: c440695fc320371d282c791d569ba3c8 (MD5) Previous issue date: 2017-09en
dc.description.statementofresponsibilityby Seher Acer.en_US
dc.embargo.release2020-09-06
dc.format.extentxv, 151 leaves : charts (some color) ; 30 cmen_US
dc.identifier.itemidB156134
dc.identifier.urihttp://hdl.handle.net/11693/33583
dc.language.isoEnglishen_US
dc.rightsinfo:eu-repo/semantics/openAccessen_US
dc.subjectSparse matricesen_US
dc.subjectRecursive bipartitioningen_US
dc.subjectGraph partitioningen_US
dc.subjectHypergraph partitioningen_US
dc.subjectDistributed-memory architecturesen_US
dc.subjectCommunication costen_US
dc.subjectEnvelope methodsen_US
dc.subjectFactorizationen_US
dc.subjectProfile reductionen_US
dc.titleRecursive bipartitioning models for performance improvement in sparse matrix computationsen_US
dc.title.alternativeSeyrek matris hesaplamalarında performans iyileşmesi için özyinelemeli ikiye bölümleme modellerien_US
dc.typeThesisen_US
thesis.degree.disciplineComputer Engineering
thesis.degree.grantorBilkent University
thesis.degree.levelDoctoral
thesis.degree.namePh.D. (Doctor of Philosophy)

Files

Original bundle
Name: 10163144.pdf
Size: 2.05 MB
Format: Adobe Portable Document Format
Description: Full printable version

License bundle
Name: license.txt
Size: 1.71 KB
Format: Item-specific license agreed upon to submission