Using PHP Curl requests with NodeJS Api server - Performance?

Gosu

Aspirant
Joined
Aug 6, 2017
Messages
35
Basically, what I'm trying to achieve is to write all the API code in Express, either on port 3000 or on a different server: api.website.com (Node server)
website.com (PHP website)

Then I'd use PHP curl requests to render views on the server side, without using front-end frameworks like Angular, only jQuery where needed, such as for pagination. The reason I want to do this is to make it easier to migrate to a different front-end technology if needed in the future, and to get easy SEO with PHP server-side rendering. Also, I'm not a big fan of JS frameworks and don't know them too well. I know Angular and React can do server-side rendering, but like I said, I'd rather spend more time developing this particular project than learning new technologies right now.
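The setup described above could be sketched roughly like this: the PHP front end fetches JSON from the Express API and renders it as HTML. The endpoint URL, response shape, and field names here are illustrative assumptions, not part of the thread.

```php
<?php
// Sketch of the PHP side fetching data from a hypothetical Express
// endpoint and rendering it server-side. URL and fields are assumed.
function fetchFromApi(string $url): ?array
{
    $ch = curl_init($url);
    curl_setopt_array($ch, [
        CURLOPT_RETURNTRANSFER => true,  // return the body instead of printing it
        CURLOPT_TIMEOUT        => 5,     // fail fast if the API is slow or down
        CURLOPT_FAILONERROR    => true,  // treat HTTP >= 400 as failure
    ]);
    $body = curl_exec($ch);
    curl_close($ch);

    return $body === false ? null : json_decode($body, true);
}

// Render the view from the API data; show nothing if the fetch failed.
$posts = fetchFromApi('http://api.website.com:3000/posts');
if ($posts !== null) {
    foreach ($posts as $post) {
        echo '<h2>' . htmlspecialchars($post['title']) . '</h2>';
    }
}
```

Note the short timeout and the null-check: every page load now depends on the API being reachable, which is exactly the performance concern raised below.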

I don't know if sending multiple curl requests on every page load is a good idea; any suggestions would be helpful.
 

Ryan Ashbrook

IPS Developer
Joined
Jan 26, 2004
Messages
3,571
I would definitely not recommend that. Sending a Curl request, especially multiple requests, on every page load can slow down loading times while the server fetches the external content. This is particularly apparent if a temporary network issue interrupts the request, or if the receiving server has (and it should, mind you) security policies in place that will block what appears to be a flood of requests from the same IP.

If Curl is the only way you can proceed, then I would recommend caching the result of the request on the local server (via Redis, the file system, or otherwise) and serving that instead. That will save you some load time. You can then expire that cache every thirty minutes or so (your mileage may vary), and let the server refetch the data from the remote server when it determines the cache is "out of date." The only downside is that it will take up to that amount of time before new content actually appears on the live server (unless you clear the cache manually).
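The caching approach described above could look something like this, using plain file storage with a TTL. The key names, paths, and fetch callback are illustrative assumptions; Redis would follow the same pattern with `SETEX`-style expiry instead of file timestamps.

```php
<?php
// Sketch of cache-then-refetch: serve a local copy while it is fresh,
// and only hit the remote server once the TTL has elapsed.
function cachedFetch(string $key, int $ttl, callable $fetch): string
{
    $file = sys_get_temp_dir() . '/api_cache_' . md5($key);

    // Serve the cached copy while it is still fresh.
    if (is_file($file) && (time() - filemtime($file)) < $ttl) {
        return file_get_contents($file);
    }

    // Otherwise refetch from the remote server and refresh the cache.
    $body = $fetch($key);
    file_put_contents($file, $body);
    return $body;
}

// Usage: wrap the actual Curl call in the callback; 1800s = 30 minutes.
$html = cachedFetch('/posts', 1800, function (string $key): string {
    // ... the Curl request to the Node API would go here ...
    return 'rendered content for ' . $key;
});
```

On a cache hit no network request is made at all, which is what recovers the page-load time.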

Ultimately, though, I don't think it would be worth the effort.
 

Gosu

Aspirant
Joined
Aug 6, 2017
Messages
35

Thanks for confirming it. I thought it wasn't a good idea.

I have a question, though: what is the difference between a normal AJAX call and a Curl request / file_get_contents? The only difference I can think of is that one is on the client side while the other is on the server. Don't front-end frameworks do the exact same thing on every page load?
 

Ryan Ashbrook

IPS Developer
Joined
Jan 26, 2004
Messages
3,571

Not really, no. I think you may be confusing AJAX and Curl, which are two different things. A Curl request is made at the server level (whether via PHP, Perl, SSH, Node, etc.) to another server, similar to when you visit a website for the first time and your browser requests the entire page.

Whereas an AJAX request is typically made from the client to the local server, as a means of sending new content to the client without forcing it to re-render the full page. There are other uses, but this is the most common.

The two can then be used together to achieve a common goal. For example, in the Invision software, if you have an avatar on one site that you also want to use on a site powered by Invision, you can "remotely link" it to the Invision site. Behind the scenes, an AJAX request is made from the client to the local server, passing PHP the remote URL. Then PHP, at the server level, makes a Curl request to fetch the remote image from that URL, so it can be pulled and stored on the local storage system, as if the image had been manually uploaded by the user.
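The server-level half of that flow, the part the AJAX handler triggers, might look roughly like this. All names are illustrative; this is not the actual Invision implementation, just a sketch of "curl-fetch a remote URL and store it locally."

```php
<?php
// Sketch of the server-level step: fetch a remote image via Curl and
// store it locally, as if the user had uploaded it manually.
// Function name, paths, and options are assumptions for illustration.
function storeRemoteImage(string $url, string $destDir): ?string
{
    $ch = curl_init($url);
    curl_setopt_array($ch, [
        CURLOPT_RETURNTRANSFER => true,  // capture the bytes instead of printing
        CURLOPT_FOLLOWLOCATION => true,  // follow redirects to the real image
        CURLOPT_TIMEOUT        => 10,    // give up on unreachable hosts
    ]);
    $data = curl_exec($ch);
    curl_close($ch);

    if ($data === false) {
        return null; // remote fetch failed; the AJAX response would report this
    }

    // Store under a deterministic name derived from the source URL.
    $path = $destDir . '/' . md5($url) . '.img';
    file_put_contents($path, $data);
    return $path;
}
```

The AJAX request itself only carries the URL string; all the actual fetching happens server-to-server, which is the distinction being drawn above.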

But, thinking about it: if all you're after is a normalized storage location for your data that can be rendered by anything (should you choose to switch things up a bit), then MySQL or similar would likely be a better choice for storing the information, as databases are designed for exactly that purpose.
 