
[Question] Best way to check if a website is online [PHP/C#]

Hey,

A few weeks ago I made a system in C# that checks whether a website is online, using HttpWebRequest and HttpWebResponse. But it took 1 minute and 40 seconds with 2 threads to check 200 websites, so you never get an exact 5-minute interval. I was wondering if it would be better to build it in PHP with cURL? Or is there a better way to check if a website is online in C#?

Sincerely,

Jamie
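
For reference, here's a minimal sketch of the kind of blocking HttpWebRequest/HttpWebResponse check described above (the HEAD method and the timeout are my assumptions, not the original code). Each call waits for the full response before the next site can be tested, which is why 200 sites on 2 threads take so long:

Code:
using System;
using System.Net;

class BlockingChecker
{
    // A blocking check: each call waits for the full response
    // before the next site can be tested.
    static bool IsOnline(string url)
    {
        try
        {
            var request = (HttpWebRequest)WebRequest.Create(url);
            request.Method = "HEAD";  // headers are enough to prove the site is up
            request.Timeout = 10000;  // otherwise a dead site stalls the thread far longer
            using (var response = (HttpWebResponse)request.GetResponse())
                return response.StatusCode == HttpStatusCode.OK;
        }
        catch (WebException)
        {
            return false;
        }
    }
}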
 
Do you want to check if port 80 is open, or whether the website returns content?
If it's just the port, you can loop through the websites and use fsockopen to check port 80. You can also take a look at this for multithreaded cURL.

But honestly, PHP would be a bad language for what you want.
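
Since the thread leans C#, the fsockopen-style port check translates to TcpClient. Here's a minimal sketch (the 3-second timeout is an arbitrary choice); it only tests whether port 80 accepts a TCP connection, without caring what the site returns:

Code:
using System;
using System.Net.Sockets;

class PortChecker
{
    // C# analogue of PHP's fsockopen: just see whether the port
    // accepts a TCP connection.
    static bool IsPortOpen(string host, int port)
    {
        try
        {
            using (var client = new TcpClient())
            {
                // BeginConnect plus a timed wait lets us enforce our own timeout.
                IAsyncResult result = client.BeginConnect(host, port, null, null);
                bool connected = result.AsyncWaitHandle.WaitOne(TimeSpan.FromSeconds(3));
                if (connected) client.EndConnect(result);
                return connected && client.Connected;
            }
        }
        catch (SocketException)
        {
            return false;
        }
    }
}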
 
I'm not really sure about that; if I'm right, Pingdom uses PHP and cURL to check websites.

https://www.pingdom.com/services/api-documentation-rest/

@Wizcoder, thanks for the link. Looks pretty good.

Pretty smart. I just found a comment that said you could use a cronjob that checks around 100 websites. Using PHP, cURL, and cronjobs seems like a great idea.
 
Thanks. However, isn't asynchronous actually a start-stop program? Is it possible to check multiple websites at the same time? (I know it's possible with multithreading.)

It is. Asynchronous sockets will not block your main thread; they call a function as soon as they're done processing. That way your main thread is never blocked, and the asynchronous sockets run on the managed .NET thread pool.
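
A minimal sketch of that idea with HttpWebRequest's BeginGetResponse/EndGetResponse (the URLs and the HEAD method are illustrative): the loop fires all requests without waiting, and each callback runs on a thread-pool thread when its response arrives:

Code:
using System;
using System.Net;

class AsyncChecker
{
    static void Main()
    {
        string[] sites = { "http://example.com/a", "http://example.com/b" };

        // Fire off one non-blocking request per site; nothing here waits.
        foreach (string url in sites)
        {
            var request = (HttpWebRequest)WebRequest.Create(url);
            request.Method = "HEAD"; // headers are enough to prove the site is up
            request.BeginGetResponse(OnResponse, request);
        }

        Console.ReadLine(); // callbacks arrive on .NET thread-pool threads
    }

    static void OnResponse(IAsyncResult result)
    {
        var request = (HttpWebRequest)result.AsyncState;
        try
        {
            using (var response = (HttpWebResponse)request.EndGetResponse(result))
                Console.WriteLine("{0} is online ({1})", request.RequestUri, response.StatusCode);
        }
        catch (WebException)
        {
            Console.WriteLine("{0} appears to be offline", request.RequestUri);
        }
    }
}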
 
Asynchronous (in programming) means running multiple actions at the same time, as opposed to synchronous (in programming), where each action sits dormant until the preceding actions have completed. Here's a textual diagram for you:

Synchronous log example:
Code:
actionA running
actionA requesting http://example.com/a
// waiting for request
actionA finished.
actionB running
actionB requesting http://example.com/b
// waiting for request
actionB finished.
actionC running
actionC requesting http://example.com/c
// waiting for request
actionC finished.
All actions are complete!
Notice that in the synchronous program log, we have to wait for each request before the next request can be sent. That's called blocking: the CPU is twiddling its thumbs when it could be working. We can keep the CPU busy (like it wants to be) by having it send all those requests at the same time using our asynchronous program:

Asynchronous log example:
Code:
actionA running
actionB running
actionC running
All actions are running!
actionA requesting http://example.com/a
actionB requesting http://example.com/b
actionC requesting http://example.com/c
// waiting for requests...
actionB finished. // note: the order by the end is first-done first-served.
actionA finished.
actionC finished.
All actions completed!

Since making a request is very inexpensive computation-wise but very expensive time-wise, we can run 3 requests at the same time in the same process. Using extra threads is very expensive compared to running requests asynchronously in the same process. Multithreading is one way to do asynchronous programming, but certainly not ideal for this.

One problem with asynchronous logic is knowing when all actions are completed. Sure, they complete faster, but the program runs through and doesn't give us the convenience of putting a Console.WriteLine at the end of our program and having it work as expected. Instead, we need code for each action that tells us when it's done and adds something to a pool of completed actions: a list, an array, a map, whatever you call it in C#. Whenever an action completes, you trigger some code that checks whether all actions are in that map/list/array. If they are, then we run the "All actions completed!" part of the program.
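
As a sketch of that completion tracking, assuming the same BeginGetResponse approach as above, a counter decremented with Interlocked can stand in for the list/map: whichever action finishes last sees zero and triggers the "all done" code:

Code:
using System;
using System.Net;
using System.Threading;

class CompletionTracker
{
    static int _pending; // how many actions are still in flight

    static void Main()
    {
        string[] sites = { "http://example.com/a", "http://example.com/b", "http://example.com/c" };
        _pending = sites.Length;

        foreach (string url in sites)
        {
            var request = (HttpWebRequest)WebRequest.Create(url);
            request.BeginGetResponse(OnDone, request);
        }

        Console.ReadLine();
    }

    static void OnDone(IAsyncResult result)
    {
        var request = (HttpWebRequest)result.AsyncState;
        try
        {
            request.EndGetResponse(result).Close();
            Console.WriteLine("{0} finished.", request.RequestUri);
        }
        catch (WebException)
        {
            Console.WriteLine("{0} failed.", request.RequestUri);
        }

        // Atomically count this action as done; the callback that runs last
        // sees zero and runs the "All actions completed!" part of the program.
        if (Interlocked.Decrement(ref _pending) == 0)
            Console.WriteLine("All actions completed!");
    }
}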

So, it's a bit more complex, but learning to master this pattern will greatly benefit your programming career.
 
You won't get an equal-time result on 200 websites, but by using C# you're on a good track already. I bet you're working with a WCF service in this case, hehe.
If you want the results at almost the same time, let your main thread start 200 threads... OK, I'll stop the jokes, but that's the theoretical way. In practice you can only start as many threads as you're willing to risk (it depends on your machine), run them asynchronously, and wait your minute and change. You can, however, keep the 5-minute interval anchored to the first web request; the others will follow as well as they can.
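
To put a number on "as many threads as you want to risk", here's a sketch that caps concurrency with a Semaphore and the thread pool (the limit of 20 and the timeout are arbitrary example values):

Code:
using System;
using System.Net;
using System.Threading;

class ThrottledChecker
{
    // Cap how many checks run at once instead of starting 200 threads.
    // 20 is an arbitrary example limit; tune it to what your machine can risk.
    static readonly Semaphore Slots = new Semaphore(20, 20);

    static void Main()
    {
        string[] sites = { "http://example.com/a", "http://example.com/b" };

        foreach (string url in sites)
        {
            Slots.WaitOne();   // block until one of the 20 slots is free
            string site = url; // copy for the closure
            ThreadPool.QueueUserWorkItem(_ =>
            {
                try
                {
                    var request = (HttpWebRequest)WebRequest.Create(site);
                    request.Timeout = 10000; // don't let one dead site hog a slot
                    using (request.GetResponse())
                        Console.WriteLine("{0} is online", site);
                }
                catch (WebException)
                {
                    Console.WriteLine("{0} appears offline", site);
                }
                finally
                {
                    Slots.Release();
                }
            });
        }

        Console.ReadLine(); // keep the process alive while workers finish
    }
}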
 
@offtopic: Pingdom is an excellent tool; I'm using it on a daily basis in the company I work for. I'm guessing you're aiming at roughly what they did.

@on topic:

PHP: I wouldn't say it's a bad choice, maybe just not the best.
Two approaches:
- There are Apache/PHP mods that let you fork processes, which gives you some kind of multithreading
- Running the script via cron every few minutes. Why? In one cron loop you can call the same script multiple times, which will simulate multithreading: cron neither cares nor waits whether the process it started has ended. Also, if I remember correctly, any script called from Linux's shell with an '&' sign at the end runs in the background, so the shell doesn't wait for its output - correct me if I'm wrong, I'm not an admin.

So the approaches are:
- One everlasting PHP script based on forked processes
- A PHP script called from cron (multiple times if needed, to pretend the copies are threads); a hypothetical crontab for this is sketched below
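For the cron variant, a crontab entry along these lines (the path and script name are made up) would launch three background copies of the checker every 5 minutes, each '&' detaching a copy so cron doesn't wait for it:

Code:
# Hypothetical crontab: three background copies of the checker every 5 minutes.
*/5 * * * * php /var/www/checker/check_sites.php & php /var/www/checker/check_sites.php & php /var/www/checker/check_sites.php &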

To be honest, I've built a solution based on the cron approach: it has to check, through a proxy, whether Google has indexed certain links. Cron runs every few minutes, yet I made it kind of single-threaded: I call the script once per cron loop, forcing it to check around 1,000 websites at once. In case the script doesn't finish in time - you know, Google bans my proxies pretty fast - the next loop will run the script anyway, pushing it another 1,000 websites. Usually I have around 5-10 processes of the script running at the same time - packages usually come with 300,000+ links in them.

C#, or maybe rather C++?
A simple C++ program that takes advantage of multithreading and runs as a daemon on the server, or C++ with multithreading but no daemon, called often by cron.


Both of those solutions have one more part to implement: a semaphore/locking system for your DB records, to prevent processes from checking the same websites at the same time. The pattern should be lock -> read -> check -> write -> release, with the lock/read step conditional on the last check time. Assigning a process/fork/thread ID to the locked records might be useful or not; I'm only speaking plain theory here.
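
A minimal sketch of that lock -> read -> check -> write -> release pattern, assuming a hypothetical MySQL table sites(id, url, locked_by, last_checked, is_online) and MySQL Connector/NET; the claim UPDATE is what makes the lock/read step conditional on the last check time:

Code:
using System;
using MySql.Data.MySqlClient; // MySQL Connector/NET

class RecordLocker
{
    // Lock + read: atomically claim rows that nobody holds and that are
    // due for a check, so two workers never grab the same sites.
    const string ClaimSql = @"
        UPDATE sites
        SET locked_by = @worker
        WHERE locked_by IS NULL
          AND last_checked < NOW() - INTERVAL 5 MINUTE
        LIMIT 100;";

    // Write + release: store the result and drop the lock in one statement.
    const string ReleaseSql = @"
        UPDATE sites
        SET is_online = @online, last_checked = NOW(), locked_by = NULL
        WHERE id = @id AND locked_by = @worker;";

    static void Claim(MySqlConnection conn, string worker)
    {
        using (var cmd = new MySqlCommand(ClaimSql, conn))
        {
            cmd.Parameters.AddWithValue("@worker", worker);
            cmd.ExecuteNonQuery();
        }
    }

    static void Release(MySqlConnection conn, string worker, int id, bool online)
    {
        using (var cmd = new MySqlCommand(ReleaseSql, conn))
        {
            cmd.Parameters.AddWithValue("@online", online);
            cmd.Parameters.AddWithValue("@id", id);
            cmd.Parameters.AddWithValue("@worker", worker);
            cmd.ExecuteNonQuery();
        }
    }
}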



Both approaches key off the last check time, not off how long your process needs to check the sites, so you can get very precise check intervals.

I hope my blabbering helps you a bit.
 