79810226

Date: 2025-11-05 14:51:21
Score: 0.5
Natty:
Report link

Reading files with PHP is slow.

First, your PHP code has to be parsed and compiled before it runs, and that interpretation overhead is itself slow compared to native code.

Next, your PHP code has to check whether the file exists (file_exists) and then read it in some way, for example with file_get_contents or line by line.

"file_get_contents" will run into memory issues on large files. Also reading many files to look for a certain thing will mean you will have to open all files to find that one thing.

Say, for example, you have a million registered users, and each user's info is stored in their own file.

Now you want to find out names of all male users who are living in California.

You will need to open each user file and read the data to find what you are looking for, and opening a million files with PHP takes far too long.

In MySQL, this is a single query, and it will probably execute in seconds or less.
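Here is a sketch of that one-query lookup via PDO. SQLite in-memory is used so the example runs self-contained; against MySQL only the DSN in the PDO constructor would change. The table and column names (users, name, gender, state) are assumptions for illustration:

```php
<?php
// In-memory stand-in for the real MySQL database.
$pdo = new PDO('sqlite::memory:');
$pdo->exec('CREATE TABLE users (name TEXT, gender TEXT, state TEXT)');
$pdo->exec("INSERT INTO users VALUES
    ('John Smith', 'male',   'California'),
    ('Jane Doe',   'female', 'California'),
    ('Bob Lee',    'male',   'Texas')");

// One query replaces a million file opens; with an index on
// (gender, state) this stays fast even at large row counts.
$stmt = $pdo->prepare(
    'SELECT name FROM users WHERE gender = :g AND state = :s'
);
$stmt->execute([':g' => 'male', ':s' => 'California']);
$names = $stmt->fetchAll(PDO::FETCH_COLUMN);
// $names is ['John Smith']
```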

What you need is a mix of both: the PHP file system and MySQL.

Data that is accessed again and again can be stored in its own file and opened directly with PHP, while all of the data also goes into MySQL for use with complex queries.

For example, user "JOHN_SMITH" has a file JOHN_SMITH.txt stored somewhere outside public_html.

When another user wants to see info about JOHN_SMITH at example.com/profile/JOHN_SMITH/, your PHP code simply opens that one file and displays the info. The task completes almost instantly, with no need to touch MySQL, leaving the database free for more complex queries.
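A minimal sketch of that hybrid pattern is below. The cache directory, file name, and profile contents are placeholders set up inline so the sketch is self-contained; in production the files would live outside public_html and be written whenever a profile changes:

```php
<?php
// Set up a placeholder cache file for illustration.
$cacheDir = sys_get_temp_dir() . '/user_cache';
@mkdir($cacheDir);
file_put_contents($cacheDir . '/JOHN_SMITH.txt', 'John Smith, male, California');

// The username would come from the routed URL, e.g. /profile/JOHN_SMITH/.
$username = 'JOHN_SMITH';

// Never build paths from raw user input; whitelist the characters.
if (!preg_match('/^[A-Z0-9_]+$/', $username)) {
    exit('Invalid username');
}

$file = $cacheDir . '/' . $username . '.txt';

if (is_readable($file)) {
    // Hot path: one small file read, no database round trip.
    $profile = file_get_contents($file);
} else {
    // Cold path: query MySQL here, then write the result to $file
    // so the next request hits the file cache instead.
    $profile = null;
}
echo $profile;
```

The whitelist check matters: building a path from a raw URL segment would otherwise allow directory traversal (e.g. ../../etc/passwd).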

Also, you need a really big database table and extremely high traffic before speed becomes an issue. If it has become an issue in your case with only a little extra traffic, then either the server is too weak (too little processing power) or your database is not designed and indexed properly.

Reasons:
  • Whitelisted phrase (-1): in your case
  • Contains signature (1):
  • Long answer (-1):
  • No code block (0.5):
  • Low reputation (1):
Posted by: John