PHP: Match URL and Fetch HTML of All Pages of an External Domain

I am using the following PHP code to fetch HTML from a website with file_get_contents(), but it only fetches the front page. Now I want to map all the URLs of the domain and fetch every requested page, without saving them in a database. The idea is to clone the whole www.theidioms.com domain as www.theidioms.in, rewriting the permalinks and localized text with str_replace(). The purpose of the script is to serve multiple language-specific domains.

<?php
// Fetch the front page's HTML (the variable holds page source, not a URL).
$html = file_get_contents('https://www.theidioms.com/');

// Swap the .com wording for the .in wording: the i-th search string
// is replaced by the i-th replacement string.
echo str_replace(
    array(
        'The Idioms Dictionary explains common English idioms that are popular worldwide, especially in the United States, Canada, the United Kingdom, Australia, Singapore, and New Zealand.',
        'International',
        'Global Site',
        'Our locations',
        'globally',
        'worldwide',
        '<ul><li>United States</li><li>Canada</li><li>United Kingdom</li><li>Australia</li><li>New Zealand</li><li>Singapore</li></ul>',
        ' lang="en"',
        '<link rel="canonical" href="https://www.theidioms.com/" />',
        'fulfill',
    ),
    array(
        'Theidioms.in Dictionary explains common English idioms that are popular in India and the Indian subcontinent.',
        '(India)',
        'India',
        'Location',
        'in India',
        'in India',
        '<h4>We love India.</h4>',
        ' lang="en-in"',
        '<link rel="canonical" href="https://theidioms.in/" />',
        'fulfil',
    ),
    $html
);
?>
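For the URL-mapping part, one possible approach (a sketch, not the only way) is to parse each fetched page with PHP's built-in DOMDocument, collect the same-domain links, and fetch those pages in turn, keeping a visited set so nothing is fetched twice. The function names `extractInternalLinks()` and `crawlSite()` are my own, and the `$transform` callable is a placeholder for the str_replace() call shown above:

```php
<?php
// Collect absolute, same-domain URLs from an HTML string.
// $base is the site root without a trailing slash,
// e.g. 'https://www.theidioms.com'.
function extractInternalLinks(string $html, string $base): array
{
    $doc = new DOMDocument();
    // Suppress warnings from imperfect real-world HTML.
    @$doc->loadHTML($html);

    $links = array();
    foreach ($doc->getElementsByTagName('a') as $a) {
        $href = $a->getAttribute('href');
        if ($href === '' || $href[0] === '#') {
            continue; // skip empty links and in-page fragments
        }
        if (strpos($href, '/') === 0) {
            $href = $base . $href; // root-relative -> absolute
        }
        if (strpos($href, $base) === 0) {
            $links[] = $href; // keep same-domain links only
        }
    }
    return array_values(array_unique($links));
}

// Breadth-first crawl: fetch each page, print its transformed HTML,
// and queue any internal links not seen before.
function crawlSite(string $base, callable $transform): void
{
    $queue = array($base . '/');
    $seen = array();
    while ($queue) {
        $url = array_shift($queue);
        if (isset($seen[$url])) {
            continue;
        }
        $seen[$url] = true;
        $html = file_get_contents($url);
        if ($html === false) {
            continue; // skip pages that fail to load
        }
        echo $transform($html); // e.g. the str_replace() call above
        foreach (extractInternalLinks($html, $base) as $link) {
            $queue[] = $link;
        }
    }
}
?>
```

Note that crawling every page on each request will be slow; in practice you would serve only the one page that matches the requested path and use extractInternalLinks() just to rewrite its permalinks.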