
Java Program to Multiply Two Matrices Using Multi-dimensional Arrays

In this Java program, you’ll learn how to multiply two matrices using multi-dimensional (2D) arrays.

How to Multiply Two Matrices Using Multi-dimensional Arrays in Java?

Matrix multiplication is only defined when the number of columns of the first matrix (c1) equals the number of rows of the second matrix (r2). The product then has r1 rows and c2 columns, and each element product[i][j] is the sum of firstMatrix[i][k] * secondMatrix[k][j] for k from 0 to c1 - 1. The program below computes this with three nested loops. For example, product[0][0] = 3*2 + (-2)*(-9) + 5*0 = 24.

Example 1: 

public class Main 
{ 
    public static void main(String[] args)  
    { 
        // Dimensions: first matrix is r1 x c1, second matrix is r2 x c2.
        // Multiplication is only defined because c1 == r2.
        int r1 = 2, c1 = 3; 
        int r2 = 3, c2 = 2; 
        int[][] firstMatrix = { {3, -2, 5}, {3, 0, 4} }; 
        int[][] secondMatrix = { {2, 3}, {-9, 0}, {0, 4} }; 

        // The product has r1 rows and c2 columns.
        int[][] product = new int[r1][c2]; 

        // Each element product[i][j] accumulates the sum of
        // firstMatrix[i][k] * secondMatrix[k][j] over k.
        for (int i = 0; i < r1; i++)  
        { 
            for (int j = 0; j < c2; j++)  
            { 
                for (int k = 0; k < c1; k++)  
                { 
                    product[i][j] += firstMatrix[i][k] * secondMatrix[k][j]; 
                } 
            } 
        } 

        // Print the result row by row.
        System.out.println("Multiplication of two matrices is: "); 
        for (int[] row : product) { 
            for (int column : row) { 
                System.out.print(column + "    "); 
            } 
            System.out.println(); 
        } 
    } 
}

OUTPUT 

Multiplication of two matrices is:  
24    29     
6    25     
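
Example 2: 

The example above hard-codes the dimensions. As a variation, here is a minimal sketch of the same triple-loop algorithm factored into a reusable method that works for any compatible sizes and rejects incompatible ones. The class name MatrixUtil and the method name multiply are illustrative choices, not part of the example above.

import java.util.Arrays;

public class MatrixUtil {
    // Multiplies two matrices of arbitrary compatible sizes.
    // Throws IllegalArgumentException if the column count of the first
    // matrix does not match the row count of the second.
    public static int[][] multiply(int[][] a, int[][] b) {
        int r1 = a.length, c1 = a[0].length;
        int r2 = b.length, c2 = b[0].length;
        if (c1 != r2) {
            throw new IllegalArgumentException(
                "Columns of first matrix (" + c1 + ") must equal rows of second matrix (" + r2 + ")");
        }
        int[][] product = new int[r1][c2];
        for (int i = 0; i < r1; i++) {
            for (int j = 0; j < c2; j++) {
                for (int k = 0; k < c1; k++) {
                    product[i][j] += a[i][k] * b[k][j];
                }
            }
        }
        return product;
    }

    public static void main(String[] args) {
        int[][] firstMatrix = { {3, -2, 5}, {3, 0, 4} };
        int[][] secondMatrix = { {2, 3}, {-9, 0}, {0, 4} };
        int[][] product = multiply(firstMatrix, secondMatrix);
        for (int[] row : product) {
            System.out.println(Arrays.toString(row));
        }
    }
}

Factoring the loops into a method makes the dimension check explicit and lets you reuse the same logic for matrices of any size.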
