[Question] Best way to check if a website is online [PHP/C#]

Developer
Member
Joined
Jul 28, 2009
Messages
983
Reaction score
133
Hey,

A few weeks ago I made a system in C# that checks whether a website is online, using HttpWebRequest and HttpWebResponse. But it took 1 minute and 40 seconds with 2 threads to check 200 websites, so you never get an exact 5 minute interval. I was wondering if it would be better to build it in PHP with cURL? Or is there a better way to check if a website is online in C#?
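Roughly, the check looks like this (a simplified sketch; the HEAD request and timeout values are just illustrative, and the real version loops over all 200 sites on 2 threads):

Code:
using System;
using System.Net;

class UptimeCheck
{
    // Minimal "is it online" probe with HttpWebRequest/HttpWebResponse.
    // A WebException that carries a response (e.g. a 404) still means the
    // server is up; only a null response means we never reached it.
    static bool IsOnline(string url)
    {
        try
        {
            var request = (HttpWebRequest)WebRequest.Create(url);
            request.Method = "HEAD";   // the status line is enough, skip the body
            request.Timeout = 5000;    // don't let one dead host stall the run
            using (var response = (HttpWebResponse)request.GetResponse())
                return true;           // blocks here until the server answers
        }
        catch (WebException ex)
        {
            return ex.Response != null;
        }
        catch
        {
            return false;              // DNS failure, refused connection, ...
        }
    }

    static void Main()
    {
        Console.WriteLine(IsOnline("http://example.com") ? "online" : "offline");
    }
}

The blocking GetResponse() call is why 2 threads take so long: each thread sits idle waiting for one site at a time.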

Sincerely,

Jamie
 
Pee Aitch Pee
Joined
Mar 30, 2011
Messages
630
Reaction score
422
Do you want to check if port 80 is open, or if the website returns content?
If it's just the port, you can loop through the websites and use fsockopen to check port 80; you can also take a look at multithreaded cURL (curl_multi).

But honestly, PHP would be a bad language for what you want.
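For the port-80 route, the C# analogue of the fsockopen probe is a plain TCP connect; a rough sketch (the timeout handling here is just one of several ways to do it):

Code:
using System;
using System.Net.Sockets;

class PortCheck
{
    // C# analogue of PHP's fsockopen probe: try to open TCP port 80.
    // This only proves something is listening, not that the site serves content.
    static bool IsPortOpen(string host, int port, int timeoutMs)
    {
        try
        {
            using (var client = new TcpClient())
            {
                IAsyncResult connect = client.BeginConnect(host, port, null, null);
                if (!connect.AsyncWaitHandle.WaitOne(timeoutMs))
                    return false;          // nobody answered within the timeout
                client.EndConnect(connect);
                return true;
            }
        }
        catch (SocketException)
        {
            return false;                  // refused, unreachable, bad DNS, ...
        }
    }

    static void Main()
    {
        Console.WriteLine(IsPortOpen("example.com", 80, 3000));
    }
}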
 
Developer
Member
Joined
Jul 28, 2009
Messages
983
Reaction score
133
Do you want to check if port 80 is open, or if the website returns content?
If it's just the port, you can loop through the websites and use fsockopen to check port 80; you can also take a look at multithreaded cURL (curl_multi).

But honestly, PHP would be a bad language for what you want.
I'm not really sure about that; if I'm right, Pingdom uses PHP and cURL to check websites.



@Wizcoder, thanks for the link. Looks pretty good.

Pretty smart; I just found a comment that said you could use a cronjob that checks around 100 websites per run. Using PHP, cURL and cronjobs seems like a great idea.
 
Praise the Sun!
Member
Joined
Dec 4, 2007
Messages
2,502
Reaction score
986
Asynchronous sockets in C# would be your best bet, I think. Something like a port-scanner example, except you're not making a port scanner but an IP:port scanner.
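A minimal Task-based sketch of that kind of IP:port check (the hosts and port below are placeholders, not from the thread):

Code:
using System;
using System.Net.Sockets;
using System.Threading.Tasks;

class IpPortScanner
{
    // One IP:port probe; awaiting ConnectAsync ties up no thread while waiting.
    static async Task<bool> CheckAsync(string host, int port)
    {
        try
        {
            using (var client = new TcpClient())
            {
                await client.ConnectAsync(host, port);
                return true;
            }
        }
        catch (SocketException)
        {
            return false;
        }
    }

    static void Main()
    {
        string[] hosts = { "example.com", "example.org" };
        var tasks = new Task<bool>[hosts.Length];
        for (int i = 0; i < hosts.Length; i++)
            tasks[i] = CheckAsync(hosts[i], 80);   // all probes start immediately

        bool[] results = Task.WhenAll(tasks).Result;
        for (int i = 0; i < hosts.Length; i++)
            Console.WriteLine(hosts[i] + ": " + (results[i] ? "open" : "closed"));
    }
}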
 
Developer
Member
Joined
Jul 28, 2009
Messages
983
Reaction score
133
Asynchronous sockets in C# would be your best bet, I think. Something like a port-scanner example, except you're not making a port scanner but an IP:port scanner.
Thanks, however asynchronous is actually a start-stop program. Isn't it possible to check multiple websites at the same time? (I know it's possible with multithreading.)
 
Praise the Sun!
Member
Joined
Dec 4, 2007
Messages
2,502
Reaction score
986
Thanks, however asynchronous is actually a start-stop program. Isn't it possible to check multiple websites at the same time? (I know it's possible with multithreading.)

It is; asynchronous sockets will not block your main thread, and they call a function (a callback) as soon as processing is done. That way your main thread is never blocked, and the asynchronous work runs on the managed .NET thread pool.
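In sketch form, that callback model with the OP's HttpWebRequest looks like this (the URL and the Sleep that keeps the demo alive are placeholders):

Code:
using System;
using System.Net;
using System.Threading;

class CallbackCheck
{
    static void Main()
    {
        var request = (HttpWebRequest)WebRequest.Create("http://example.com");

        // Returns immediately; OnResponse runs on a .NET thread-pool thread
        // as soon as the response (or an error) arrives.
        request.BeginGetResponse(OnResponse, request);

        Console.WriteLine("Main thread is free to fire off more checks here...");
        Thread.Sleep(10000);   // crude way to keep the demo process alive
    }

    static void OnResponse(IAsyncResult ar)
    {
        var request = (HttpWebRequest)ar.AsyncState;
        try
        {
            using (var response = (HttpWebResponse)request.EndGetResponse(ar))
                Console.WriteLine(request.RequestUri + " is online (" + (int)response.StatusCode + ")");
        }
        catch (WebException)
        {
            Console.WriteLine(request.RequestUri + " looks offline");
        }
    }
}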
 
Joined
Jun 8, 2007
Messages
1,985
Reaction score
490
Thanks, however asynchronous is actually a start-stop program. Isn't it possible to check multiple websites at the same time? (I know it's possible with multithreading.)
Asynchronous (in programming) means running multiple actions at the same time, as opposed to synchronous, where each action sits dormant until the preceding actions have completed. Here's a textual diagram for you:

Synchronous log example:
Code:
actionA running
actionA requesting http://example.com/a
// waiting for request
actionA finished.
actionB running
actionB requesting http://example.com/b
// waiting for request
actionB finished.
actionC running
actionC requesting http://example.com/c
// waiting for request
actionC finished.
All actions are complete!
Notice in the synchronous program log that we have to wait for each request before the next request can be sent. That's called blocking: the CPU is twiddling its thumbs when it could be working. We can keep the CPU busy (like it wants to be) by having it send all those requests at the same time using our asynchronous program:

Asynchronous log example:
Code:
actionA running
actionB running
actionC running
All actions are running!
actionA requesting http://example.com/a
actionB requesting http://example.com/b
actionC requesting http://example.com/c
// waiting for requests...
actionB finished. // note: the order by the end is first-done first-served.
actionA finished.
actionC finished.
All actions completed!

Since making a request is very inexpensive computation-wise, but very expensive time-wise, we can run 3 requests at the same time in the same process. Using extra threads is very expensive compared to running requests asynchronously in the same process. Multi-threading is a way to do asynchronous programming, but certainly not ideal for this.

One problem with asynchronous logic is knowing when all actions are completed. Sure, they complete faster, but the program runs straight through, so we don't get the convenience of putting a Console.WriteLine at the end of our program and having it work as expected. Instead, we need code for each action that tells us when it's done and adds something to a pool of completed actions (a list, an array, a map, whatever you call it in C#). Whenever an action completes, you trigger some code that checks whether all actions are in that pool. If they are, then we run the "All actions completed!" part of the program.

So, it's a bit more complex, but learning to master this pattern will greatly benefit your programming career.
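One common C# way to build that "pool of completed actions" is a countdown that each action signals when it finishes; a sketch (the URLs are placeholders):

Code:
using System;
using System.Net;
using System.Threading;

class CompletionTracking
{
    static void Main()
    {
        string[] urls = { "http://example.com/a", "http://example.com/b", "http://example.com/c" };

        // Each finished action signals the countdown once; Wait() releases
        // only when every action has signalled - the "all completed" check.
        using (var done = new CountdownEvent(urls.Length))
        {
            foreach (string url in urls)
            {
                string u = url;   // capture a fresh copy for the callback
                var request = (HttpWebRequest)WebRequest.Create(u);
                request.BeginGetResponse(ar =>
                {
                    try
                    {
                        ((HttpWebRequest)ar.AsyncState).EndGetResponse(ar).Close();
                        Console.WriteLine(u + " finished.");
                    }
                    catch (WebException)
                    {
                        Console.WriteLine(u + " failed.");
                    }
                    finally
                    {
                        done.Signal();   // count this action as completed
                    }
                }, request);
            }

            done.Wait();   // only this one line blocks
            Console.WriteLine("All actions completed!");
        }
    }
}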
 
• ♠️​ ♦️ ♣️ ​♥️ •
Joined
Mar 25, 2012
Messages
909
Reaction score
464
You won't get an equal-time result on 200 websites, but by using C# you are on a good way already. I bet you are working with a WCF service in this case, hihi.
If you want results at almost the same time, let your main thread start 200 threads... OK, I'll stop the jokes, but that is the theoretical way. In practice you can only start as many threads as you want to risk (depends on your machine), run it asynchronously, and wait your minute and some. However, you can keep the 5 minute interval anchored to the first web request; the others will follow as well as they can.
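A sketch of that "only as many at once as you want to risk" idea, capping the number of requests in flight with a semaphore (the cap of 20 and the URLs are made up):

Code:
using System;
using System.Net.Http;
using System.Threading;
using System.Threading.Tasks;

class BoundedChecks
{
    static readonly HttpClient http = new HttpClient();
    static readonly SemaphoreSlim gate = new SemaphoreSlim(20);   // at most 20 in flight

    static async Task CheckAsync(string url)
    {
        await gate.WaitAsync();                // wait for a free slot
        try
        {
            HttpResponseMessage response = await http.GetAsync(url);
            Console.WriteLine(url + " -> " + (int)response.StatusCode);
        }
        catch (HttpRequestException)
        {
            Console.WriteLine(url + " -> offline");
        }
        finally
        {
            gate.Release();                    // free the slot for the next site
        }
    }

    static void Main()
    {
        var urls = new string[200];
        for (int i = 0; i < urls.Length; i++)
            urls[i] = "http://site" + i + ".example.com";

        DateTime started = DateTime.UtcNow;    // anchor the interval to the start
        var tasks = new Task[urls.Length];
        for (int i = 0; i < urls.Length; i++)
            tasks[i] = CheckAsync(urls[i]);
        Task.WaitAll(tasks);
        Console.WriteLine("Run took " + (DateTime.UtcNow - started));
    }
}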
 
hello
Member
Joined
Jun 24, 2004
Messages
726
Reaction score
158
@offtopic: Pingdom is an excellent tool; I'm using it on a daily basis in the company I work for. I'm guessing you are aiming at kind of what they did.

@on topic:

PHP: I wouldn't say it's a bad choice, maybe just not the best.
Two approaches:
- There are Apache / PHP mods which let you fork processes, which enables you to create some kind of multi-threading.
- Running the script via cron every few minutes. Why? In one cron run you can call the same script multiple times, which in effect simulates multi-threading: cron does not care, nor wait, whether a process it started has finished or not. Also, if I remember correctly, any script called from Linux's shell with an '&' sign at the end runs in the background, and the terminal doesn't give a duck about its output. Correct me if I'm wrong, I'm not an admin.

So the approaches are:
- One everlasting PHP script based on forked processes
- A cron-called PHP script (called multiple times if needed, to pretend they are threads)

To be honest, I've created a solution based on the cron approach, which has to check through a proxy whether Google has indexed a certain link. Cron runs every few minutes, yet I did it kind of single-threaded: I call the script once per cron run, forcing it to check around 1,000 websites at once. In case the script doesn't finish in time (you know, Google bans my proxies pretty fast), the next run starts the script anyway, pushing it another 1,000 websites. Usually I have around 5-10 processes of the script running at the same time; packages usually come with 300,000+ links in them.

C#, or maybe rather C++?
A simple C++ program which could take advantage of multi-threading and run as a daemon on the server, or C++ with multi-threading, no daemon, but called often via cron.
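As a sketch (shown in C# rather than C++; the structure is the same), the daemon variant is just a loop whose sleep is anchored to the start of each run. RunBatch is a hypothetical worker; see the locking sketch below:

Code:
using System;
using System.Threading;

class CheckerDaemon
{
    static void Main()
    {
        TimeSpan interval = TimeSpan.FromMinutes(5);
        while (true)
        {
            DateTime started = DateTime.UtcNow;

            // RunBatch() would claim due sites and check them with a capped
            // pool of threads or async requests (hypothetical, see below).
            // RunBatch();

            // Sleep only the remainder, so the interval stays anchored to
            // the start of each run instead of drifting by its duration.
            TimeSpan elapsed = DateTime.UtcNow - started;
            if (elapsed < interval)
                Thread.Sleep(interval - elapsed);
        }
    }
}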


Both of those solutions have one more part to implement, which is building a semaphore/locking system for your DB records to prevent processes from checking the same websites at the same time. The pattern should be lock -> read -> check -> write -> release, with the lock/read being conditional on the last check time. Assigning a process/fork/thread id to locked records might be useful or not; I'm just speaking plain theory here.
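A sketch of that lock -> read -> check -> write -> release pattern via ADO.NET; every table and column name below is invented, and the SQL is T-SQL-flavoured, so treat it as plain theory too:

Code:
using System.Collections.Generic;
using System.Data.SqlClient;   // any ADO.NET provider follows the same shape

class RecordLocking
{
    static void RunBatch(string connString, string worker)
    {
        using (var conn = new SqlConnection(connString))
        {
            conn.Open();

            // lock: atomically claim up to 100 rows that are due and unclaimed,
            // so two concurrent workers can never grab the same site.
            var claim = new SqlCommand(
                @"UPDATE TOP (100) sites SET locked_by = @w
                  WHERE locked_by IS NULL
                    AND last_checked < DATEADD(minute, -5, GETUTCDATE())", conn);
            claim.Parameters.AddWithValue("@w", worker);
            claim.ExecuteNonQuery();

            // read: fetch what we just claimed.
            var sites = new List<KeyValuePair<int, string>>();
            var read = new SqlCommand(
                "SELECT id, url FROM sites WHERE locked_by = @w", conn);
            read.Parameters.AddWithValue("@w", worker);
            using (var r = read.ExecuteReader())
                while (r.Read())
                    sites.Add(new KeyValuePair<int, string>(r.GetInt32(0), r.GetString(1)));

            foreach (var site in sites)
            {
                bool online = Check(site.Value);   // check: any earlier probe

                // write + release in a single statement, so a crash between
                // the two can't leave a checked-but-locked row behind.
                var done = new SqlCommand(
                    @"UPDATE sites SET status = @s, last_checked = GETUTCDATE(),
                      locked_by = NULL WHERE id = @id", conn);
                done.Parameters.AddWithValue("@s", online ? 1 : 0);
                done.Parameters.AddWithValue("@id", site.Key);
                done.ExecuteNonQuery();
            }
        }
    }

    static bool Check(string url)
    {
        return true;   // placeholder: use any probe from the earlier sketches
    }
}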



Both of those approaches are driven by the last check time, not by how much time your process needs to check the sites, so you can get very precise check intervals.

I hope my blabbering helps you a bit.
 