
Generate HTML/Static Pages from Dynamic Php Page

Posted by TheRealDeal, 05-05-2005, 11:01 PM
Is there a script to generate a static file from dynamic content, e.g. from http://www.domain.com/list.php?member=sample to http://www.domain.com/sample.html (an actual file, not mod_rewrite)? Then set a cron job to rebuild it every 15 minutes or so. The reason for this: fewer calls to the MySQL database from dynamic content.

Posted by almahdi, 05-06-2005, 06:03 AM
You may use this code: Now you can generate as many static files as you want, just by calling this function.
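The code attachment did not survive in this archive. Based on the later replies (fopen on the dynamic URL, fread in a 1024-byte loop until EOF, then writing the result out), a minimal sketch might look like this; the function name, URL, and paths are illustrative, not the original poster's:

```php
<?php
// Fetch a dynamic page and save its rendered output as a static .html file.
// Sketch only: function name and paths are examples, not the original code.
function generate_static($url, $outfile)
{
    // Open the dynamic page; with an http:// URL this requires
    // allow_url_fopen to be enabled in php.ini.
    $fp = fopen($url, 'r');
    if (!$fp) {
        return false;
    }

    $html = '';
    while (!feof($fp)) {      // read in 1 KB chunks until end of file
        $html .= fread($fp, 1024);
    }
    fclose($fp);

    // Write the fetched HTML out; the target directory must be writable.
    return file_put_contents($outfile, $html) !== false;
}

// Example usage (run from cron every 15 minutes):
// generate_static('http://www.domain.com/list.php?member=sample',
//                 '/home/user/public_html/sample.html');
```

Note that fetching via HTTP (rather than including the .php file locally) is what makes PHP actually execute the script, so the saved file contains HTML rather than raw PHP source.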

Posted by total_assault, 05-06-2005, 09:29 AM
Don't forget to CHMOD the folder to 777.

Posted by TheRealDeal, 05-06-2005, 03:32 PM
Thank you so much, almahdi! About fread($fp,1024) - can this handle big files? Is 1024 about a meg?

Posted by almahdi, 05-06-2005, 04:17 PM
Yes, it can handle big files. 1024 is the size in bytes, which is equal to 1 KB. Give it a try, and if anything goes wrong, I can modify the code to suit your needs.

Posted by TheRealDeal, 05-06-2005, 05:00 PM
If I put fread($fp,1024*8); will this work?

Posted by almahdi, 05-07-2005, 11:53 AM
Hmm, there's no need: the reading isn't limited to 1 KB, since fread is looped until the end of the file, so it should work with any file size. Just use the code as it is, and if you run into anything, post back.

Posted by BurstChris, 05-12-2005, 12:12 AM
No need to overcomplicate things and reinvent the wheel. wget, fetch, curl, or lynx set via a cron job:

*/15 * * * * /usr/bin/wget -O /some/location/sample.html http://location.com/sample.cgi

fetch -o /some/location/sample.html http://location.com/sample.cgi

lynx -source http://location.com/sample.cgi > /some/location/sample.html

curl -o /some/location/sample.html http://location.com/sample.cgi

If you have Perl's LWP modules installed, you can use GET:

GET http://location.com/sample.cgi > /some/location/sample.html

Posted by TheRealDeal, 05-16-2005, 08:03 PM
What if I want to do more than one URL - say 50?

Posted by BurstChris, 05-17-2005, 09:53 AM
You can put them all manually in a shell script and then run that shell script from the crontab. You could probably use awk and the shell to generate that script if you have a list of URLs, *shrug*
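A minimal sketch of that idea, assuming the URLs sit one per line in a list file and each URL ends in a query value usable as a filename (the file and directory names here are illustrative):

```shell
#!/bin/sh
# Read URLs one per line from a list file and save each page as a
# static .html file named after its last query value.
# Filenames and paths are examples only.

build_static() {
    list="$1"    # file containing one URL per line
    outdir="$2"  # directory for the generated .html files
    while read url; do
        # e.g. http://.../list.php?member=sample -> sample
        name=$(echo "$url" | sed 's/.*=//')
        wget -q -O "$outdir/$name.html" "$url"
    done < "$list"
}

# Example, run from cron:
# build_static /home/user/urls.txt /some/location
```

A single crontab entry then covers all 50 URLs, instead of one cron line per page.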

Posted by innosys, 10-12-2005, 12:34 PM
In order to understand this issue I have done the following.

File 1: test1.php
--------------------
New document

File 2: test.php
--------------------

After running test.php, all I get is this file:

test1.html
-------------------
New document

What is wrong with my work?

Posted by michael-lane, 10-13-2005, 03:47 AM
Or you could use fread($fp, filesize($filename)) - note that filesize() takes the file name, not the handle - which checks the file's size for you and reads the whole thing in one call.

Posted by innosys, 10-13-2005, 07:36 AM
test1.html must have the result of the command phpinfo()

Posted by aht, 11-02-2005, 09:36 AM
Sorry to thread-jack, but this is a related question. What if I have a dynamic page that only works in Internet Explorer, and I want to do something similar to what is being done above and turn it into a static page that will be readable in any browser as pure HTML? It would also need to be auto-updated, so I was thinking about a cron job, but I am not sure if Linux will read this or not. One thing I am thinking of is using the PHP script above (if I can get it to work) to generate a "live" static page that would not be visible to the viewer, and then have the cron job take that page and turn it into a page that is updated every once in a while. I am using XML Data Binding and it only works in IE. I might have to try something else, but it works so well that it sucks that Firefox doesn't support it. Thanks

Posted by IIyama, 03-10-2006, 08:23 AM
Hello there, thanks for the script, it works fine. But can this script also read from a file.txt where I store all the URLs for it? Or do you have another solution, so I don't have to copy every URL into the script manually?

Posted by xr77, 06-12-2007, 10:20 AM
Hello... Did you find a solution? I would like to do the same thing: generate a static HTML file from a processed PHP file.

Posted by pcgc1xn, 06-19-2007, 08:56 PM
I was using the fread technique described above until my hosting provider moved to phpsuexec (though there may have been some other change at the same time). I also got the problem that the PHP files were not being interpreted, so I would just get a 'static' file of the PHP code. Anyway, I finally figured out/read that this works:

Step 1 - set up your cron as follows:

/usr/bin/php -q /home/directory/cronjobs/test1.php > /home/directory/includes/test1.html

The -q removes the headers from the output of the cron job. Obviously you will need to check the directories etc. The whole file with fread, fopen etc. is now unnecessary. I am not sure if there is a downside to doing it like this, but it works for me.
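Combined with the */15 schedule from earlier in the thread, the full crontab entry might look like this (the paths are the poster's examples; /usr/bin/php must be the CLI binary on your server):

```
# Every 15 minutes: run the PHP script via the CLI binary and
# redirect its output to the static HTML file.
*/15 * * * * /usr/bin/php -q /home/directory/cronjobs/test1.php > /home/directory/includes/test1.html
```

Because the script is executed locally by the PHP binary, this sidesteps the phpsuexec problem entirely - no HTTP fetch is involved.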

Posted by I5KI Bones, 04-16-2008, 07:04 PM
I have a question. Will this work with an upload script, because I need to make an html page for each file uploaded by users. If not can someone point me in the right direction?

Posted by rallport, 12-18-2010, 09:28 AM
You could also use PEAR's Cache_Lite?
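For context, Cache_Lite takes a different approach from the static-file scripts above: instead of a cron job pre-building pages, it caches a page's output on first request and serves the cached copy until it expires. A rough sketch, assuming PEAR and the Cache_Lite package are installed (the cache directory, lifetime, and page ID here are illustrative):

```php
<?php
// Sketch of PEAR Cache_Lite usage - assumes the package is installed;
// cacheDir, lifeTime and the cache ID are example values.
require_once 'Cache/Lite.php';

$cache = new Cache_Lite(array(
    'cacheDir' => '/tmp/cache/',  // must exist and be writable
    'lifeTime' => 900,            // seconds (15 minutes, as in the thread)
));

$id = 'member_sample';            // one cache entry per page
if ($html = $cache->get($id)) {
    echo $html;                   // fresh enough: serve the cached copy
} else {
    ob_start();
    // ... generate the page here (the MySQL-backed dynamic content) ...
    $html = ob_get_clean();
    $cache->save($html, $id);     // store for the next 15 minutes
    echo $html;
}
```

The effect on MySQL load is similar to the cron approach, but pages regenerate on demand rather than on a fixed schedule.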


