Discussion:
high traffic websites
Negin Nickparsa
2013-09-18 06:10:02 UTC
Permalink
In general, what are the best ways to handle high traffic websites?

VPS (clouds)?
web analyzers?
dedicated servers?
distributed memory cache?


Sincerely
Negin Nickparsa
Sebastian Krebs
2013-09-18 07:15:39 UTC
Permalink
Post by Negin Nickparsa
In general, what are the best ways to handle high traffic websites?
VPS(clouds)?
web analyzers?
dedicated servers?
distributed memory cache?
Yes :)

But seriously: that is a topic most of us have spent a lot of time getting into.
You can explain it with a bunch of buzzwords. Additionally, how do you define
"high traffic website"? Do you already _have_ such a site? Or do you
_want_ one? It's important, because I've seen it far too often that
projects spend too much effort on their "high traffic infrastructure" and
in the end it isn't that high traffic ;) I won't say that you cannot be
successful, but you should start with an effort you can handle.

Regards,
Sebastian
Post by Negin Nickparsa
Sincerely
Negin Nickparsa
--
github.com/KingCrunch
Negin Nickparsa
2013-09-18 07:38:22 UTC
Permalink
Thank you, Sebastian. Actually, I will have one if I qualify for the
job. Yes, I may fail to handle it; that's why I asked for guidance.
I wanted some tidbits to start with. I have searched through YSlow,
HTTrack and others.
I also searched through the PHP list archive in my email before asking this
question. It's the kind of question that benefits everyone and hasn't been asked
directly.


Sincerely
Negin Nickparsa
Camilo Sperberg
2013-09-18 09:20:39 UTC
Permalink
Your question is way too vague to be answered properly... My best guess would be that it depends heavily on the type of website you have and on how well the current implementation is, well... implemented.

Simply put: what works for Facebook may/will not work for LinkedIn, Twitter or Google, mainly because the type of data differs a lot: Facebook is about relations between people, Twitter is about small pieces of data that are mostly not interconnected, while Google is all about links and every type of content, from little pieces of information up to the whole of Wikipedia.

You could start by studying how Varnish and Redis/memcached work, and how proxies (nginx et al.), CDNs and that kind of stuff work, but if you want more specific answers, you should ask more specific questions.

In the PHP area, an opcode cache does the job very well and can speed up page loads by several orders of magnitude. I recommend OPcache, which is already included in PHP 5.5.
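As a rough sketch, opting in to the bundled OPcache on PHP 5.5 looks something like this in php.ini (the sizing values below are illustrative assumptions, not tuned recommendations):

```ini
; Load the bundled OPcache extension (on Windows: php_opcache.dll)
zend_extension=opcache.so

; Turn the opcode cache on
opcache.enable=1

; Illustrative sizing -- tune for your own codebase
opcache.memory_consumption=128      ; MB of shared memory for cached opcodes
opcache.max_accelerated_files=4000  ; number of scripts that can be cached
opcache.revalidate_freq=60          ; seconds between file-change checks
```

With this in place, compiled scripts are served from shared memory instead of being re-parsed on every request.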

Greetings.
--
PHP General Mailing List (http://www.php.net/)
To unsubscribe, visit: http://www.php.net/unsub.php
Negin Nickparsa
2013-09-18 11:50:52 UTC
Permalink
Thank you Camilo

To be more detailed: suppose the website has 80,000 users, each page
takes 200 ms to render, and you get a thousand hits per second, so we
want to reduce that time. Is there any way to reduce the rendering time?

Another thing: suppose users want to upload files simultaneously, and the
videos are hosted on the website itself rather than on another server like YouTube,
so the streams are really consuming the bandwidth.

Also, it is troublesome to take backups: with a bulk of data you run into
locking problems while backing up.



Sincerely
Negin Nickparsa
Sebastian Krebs
2013-09-18 12:09:53 UTC
Permalink
Post by Negin Nickparsa
Thank you Camilo
to be more in details,suppose the website has 80,000 users and each page
takes 200 ms to be rendered and you have thousand hits in a second so we
want to reduce the time of rendering. is there any way to reduce the
rendering time?
Read about frontend/proxy caching (Nginx, Varnish) and ESI/SSI includes
(also Nginx and Varnish ;)). The idea is simply: if you don't have to
process something in the backend on every request, don't process it in the backend on
every request.
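A minimal sketch of that idea in Nginx, putting a short-lived page cache in front of a PHP backend (the zone name, backend address and durations are illustrative assumptions; real setups need careful cache-key and invalidation rules):

```nginx
# Define a cache zone backed by disk and shared memory
proxy_cache_path /var/cache/nginx levels=1:2 keys_zone=pagecache:10m
                 max_size=1g inactive=10m;

server {
    listen 80;

    location / {
        proxy_pass http://127.0.0.1:8080;   # the PHP backend
        proxy_cache pagecache;
        proxy_cache_valid 200 5s;           # even a few seconds absorbs bursts
        proxy_cache_use_stale updating;     # serve stale while refreshing
        add_header X-Cache-Status $upstream_cache_status;
    }
}
```

Even a 5-second cache means a thousand identical requests per second hit PHP only a handful of times.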

But maybe you mixed up some terms, because rendering time is the time
consumed by the renderer inside the browser (HTML and CSS). That you can
improve by improving your HTML/CSS :)


I am a little bit curious: do you _really_ have 1000 requests/second, or did
you just throw some numbers in? ;)
Post by Negin Nickparsa
other thing is suppose they want to upload files simultaneously and the
videos are in the website not on another server like YouTube and so streams
are really consuming the bandwidth.
Well, if there are streams, there are streams. I cannot imagine a way
someone could stream a video without downloading it.
Post by Negin Nickparsa
Also,It is troublesome to get backups,when getting backups you have
problem of lock backing up with bulk of data.
Even at times when there is not that much traffic? An automatic backup at
3:00 in the morning, for example?
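Independent of timing, the lock problem itself can often be sidestepped. For MySQL with InnoDB tables, for instance, a consistent-snapshot dump avoids holding table locks for the whole run (the database name and backup path below are placeholders):

```shell
# Dump InnoDB tables inside a single transaction instead of locking them;
# --quick streams rows rather than buffering whole tables in memory.
mysqldump --single-transaction --quick mydb | gzip > /backups/mydb-$(date +%F).sql.gz
```

Writes can continue during the dump; the snapshot stays consistent as of the moment it started.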
--
github.com/KingCrunch
Negin Nickparsa
2013-09-18 13:04:27 UTC
Permalink
I am a little bit curious: Do you _really_ have 1000 requests/second, or do
you just throw some numbers in? ;)

Sebastian, supposedly_asking_to_get_some_pre_evaluation :)

Even in times, where there is not that much traffix? Automatic backup at
3:00 in the morning for example?

3:00 in the morning in one country is 9 AM in another country, and 3 PM in yet
another.

By the way, thank you so much, guys; I wanted tidbits and you gave me more.

Stuart, I recall your replies in other situations, and you have always helped me
improve. The list is happy to have you.

Sincerely
Negin Nickparsa
Stuart Dallas
2013-09-18 12:06:14 UTC
Permalink
Post by Negin Nickparsa
to be more in details,suppose the website has 80,000 users and each page
takes 200 ms to be rendered and you have thousand hits in a second so we
want to reduce the time of rendering. is there any way to reduce the
rendering time?
other thing is suppose they want to upload files simultaneously and the
videos are in the website not on another server like YouTube and so streams
are really consuming the bandwidth.
Also,It is troublesome to get backups,when getting backups you have problem
of lock backing up with bulk of data.
Your question is impossible to answer efficiently without profiling. You need to know what PHP is doing in those 200ms before you can target your optimisations for maximum effect.

I use Xdebug to produce trace files. From those I can see exactly what is taking the most time, and then I can look into how to make that thing faster. When I'm certain there is no faster way to do what it's doing, I move on to the next biggest thing.

Of course there are generic things you should do, such as adding an opcode cache and looking at your server setup, but targeted optimisation is far better than trying generic stuff.
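For reference, the kind of Xdebug function tracing described above could be switched on roughly like this in php.ini (Xdebug 2.x-era directives, matching the period of this thread; the output directory is an assumption):

```ini
zend_extension=xdebug.so

; Write a function trace for every request (heavy -- dev machines only)
xdebug.auto_trace=1
xdebug.trace_output_dir=/tmp/xdebug

; Include memory deltas and parameter values in the trace
xdebug.show_mem_delta=1
xdebug.collect_params=1
```

Each request then produces a trace file showing per-call timing, which is what lets you attack the biggest cost first.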

-Stuart
--
Stuart Dallas
3ft9 Ltd
http://3ft9.com/
Camilo Sperberg
2013-09-18 13:09:12 UTC
Permalink
Post by Camilo Sperberg
I recommend OPCache, which is already included in PHP 5.5.
Camilo,
I'm just curious about the disadvantages of OPcache.
My logic says there must be some issues with it; otherwise it would have come already enabled.
Sent from iPhone
The original RFC states:

https://wiki.php.net/rfc/optimizerplus
The integration proposed for PHP 5.5.0 is mostly 'soft' integration. That means that there'll be no tight coupling between Optimizer+ and PHP; Those who wish to use another opcode cache will be able to do so, by not loading Optimizer+ and loading another opcode cache instead. As per the Suggested Roadmap above, we might want to review this decision in the future; There might be room for further performance or functionality gains from tighter integration; None are known at this point, and they're beyond the scope of this RFC.

So that's why OPcache isn't enabled by default in PHP 5.5.

Greetings.
Sebastian Krebs
2013-09-18 13:12:16 UTC
Permalink
https://wiki.php.net/rfc/optimizerplus
The integration proposed for PHP 5.5.0 is mostly 'soft' integration. That
means that there'll be no tight coupling between Optimizer+ and PHP; Those
who wish to use another opcode cache will be able to do so, by not loading
Optimizer+ and loading another opcode cache instead. As per the Suggested
Roadmap above, we might want to review this decision in the future; There
might be room for further performance or functionality gains from tighter
integration; None are known at this point, and they're beyond the scope of
this RFC.
So that's why OPCache isn't enabled by default in PHP 5.5
Also worth mentioning that this is the first release with an integrated
opcode cache. Giving everyone a few releases to get used to it sounds useful
:)
Greetings.
--
github.com/KingCrunch
Negin Nickparsa
2013-09-19 16:44:28 UTC
Permalink
This may be helpful for someone:
I found GTmetrix quite helpful, almost magic. <http://gtmetrix.com/#!>


Sincerely
Negin Nickparsa