Python Forum - Discussion Board

Subject: [python-chinese] Fwd: [webpy] Nginx + WSGI == Blazing Fast!

Tuesday, January 8, 2008, 08:50

Zoom.Quiet  zoom.quiet at gmail.com
Tue Jan 8 08:50:28 HKT 2008

Here are the configuration files and the code used for both tests.

--------------------------------------------
Configuration file for Lighttpd
--------------------------------------------
server.modules = (
   "mod_rewrite",
   "mod_fastcgi",
)
server.document-root       = "/var/www/"
server.errorlog            = "/var/log/lighttpd/error.log"
server.pid-file            = "/var/run/lighttpd.pid"

## virtual directory listings
dir-listing.encoding        = "utf-8"
server.dir-listing          = "enable"
server.username            = "www-data"
server.groupname           = "www-data"

fastcgi.server = ( "/code-fastcgi.py" =>
(( "socket" => "/tmp/fastcgi.socket",
  "bin-path" => "/var/www/code-fastcgi.py",
  "max-procs" => 1
))
)

url.rewrite-once = (
 "^/favicon.ico$" => "/static/favicon.ico",
 "^/static/(.*)$" => "/static/$1",
 "^/(.*)$" => "/code-fastcgi.py/$1",
)

--------------------------------------------
Configuration file for Nginx
--------------------------------------------
worker_processes  2;
error_log  logs/error.log info;
pid        logs/nginx.pid;

events {
   worker_connections  1024;
}


env HOME;
env PYTHONPATH=/usr/bin/python;

http {
   include       conf/mime.types;
   default_type  application/octet-stream;

   sendfile        on;
   keepalive_timeout  65;

   wsgi_python_optimize 2;
   wsgi_python_executable /usr/bin/python;
   wsgi_python_home /usr/;
   wsgi_enable_subinterpreters on;

   server {
       listen       80;
       server_name  localhost;


       include conf/wsgi_vars;

       location / {
           #client_body_buffer_size 50;
           wsgi_pass /usr/local/nginx/nginx.py;

           wsgi_pass_authorization off;
           wsgi_script_reloading on;
           wsgi_use_main_interpreter on;
       }

       location /wsgi {
           #client_body_buffer_size 50;
           wsgi_var TEST test;
           wsgi_var FOO bar;
           wsgi_var EMPTY "";
           # override existing HTTP_ variables
           wsgi_var HTTP_USER_AGENT "nginx";
           wsgi_var HTTP_COOKIE  $http_cookie;

           wsgi_pass /usr/local/nginx/nginx-2.py  main;

           wsgi_pass_authorization on;
           wsgi_script_reloading off;
           wsgi_use_main_interpreter off;
       }

       location /wsgi-webpy {
          wsgi_pass /usr/local/nginx/webpy-code.py;
       }
   }
}
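
Each script named by a wsgi_pass directive above is expected to expose a
WSGI "application" callable; that is what the web.py code further down
provides via web.wsgifunc(). For reference, a minimal hand-written callable
of that shape -- a sketch only, not part of the original test -- looks like:

def application(environ, start_response):
    # Plain WSGI (PEP 333): send the status line and headers via
    # start_response(), then return the body as an iterable of strings.
    body = 'Hello from bare WSGI!\n'
    start_response('200 OK', [('Content-Type', 'text/plain'),
                              ('Content-Length', str(len(body)))])
    return [body]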

--------------------------------------------
Code for Lighttpd
--------------------------------------------
#!/usr/bin/env python

import web

urls = (
   '/(.*)', 'hello'
)

class hello:
   def GET(self, name):
       i = web.input(times=1)
       if not name: name = 'world'
       for c in xrange(int(i.times)):
           print 'Hello,', name+'!'

if __name__ == "__main__": web.run(urls, globals())
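
When lighttpd launches this script over FastCGI (the "bin-path" entry in the
configuration above), web.run() detects the FastCGI environment and serves
the application that way (in web.py of that era, presumably through flup).
An explicit flup-based runner for the same app -- a sketch under that
assumption, not what the benchmark actually ran -- would be:

#!/usr/bin/env python
# Sketch: serve the same web.py app with flup's FastCGI server directly.
# Assumes flup is installed; when lighttpd spawns the script via "bin-path",
# the listening socket is inherited on stdin, so no bindAddress is passed.
from flup.server.fcgi import WSGIServer
import web

urls = ('/(.*)', 'hello')

class hello:
    def GET(self, name):
        print 'Hello,', (name or 'world') + '!'

if __name__ == '__main__':
    WSGIServer(web.wsgifunc(web.webpyfunc(urls, globals()))).run()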

--------------------------------------------
Code for Nginx
--------------------------------------------
import web

urls = (
   '/(.*)', 'hello'
)

class hello:
   def GET(self, name):
       i = web.input(times=1)
       if not name: name = 'world'
       for c in xrange(int(i.times)): print 'Hello,', name+'!'

application = web.wsgifunc(web.webpyfunc(urls, globals()))
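
For poking at the same callable outside nginx, the standard library's wsgiref
server can host it directly -- a local-testing sketch (assuming Python 2.5+,
where wsgiref ships in the standard library), not part of the benchmark:

# Local test harness only: serve the "application" callable defined above
# with the stdlib wsgiref server on port 8080.
from wsgiref.simple_server import make_server

if __name__ == '__main__':
    httpd = make_server('127.0.0.1', 8080, application)
    print 'Serving on http://127.0.0.1:8080/'
    httpd.serve_forever()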


---------- Forwarded message ----------
From: David Cancel <dcancel at gmail.com>
Date: Jan 7, 2008 11:55 PM
Subject: [webpy] Nginx + WSGI == Blazing Fast!
To: "web.py" <webpy在googlegroups.com>



I know.. I know... Simple benchmarks mean nothing, but I couldn't help
playing with the new(ish) mod_wsgi module for my favorite web server,
Nginx.

Nginx: http://nginx.net/
Nginx mod_wsgi module: http://wiki.codemongers.com/NginxNgxWSGIModule

I tested Nginx against the recommended setup of Lighttpd/FastCGI. These
very simple and flawed tests were run on Debian Etch under virtualization
(Parallels) on my MacBook Pro. Hey, I said they were flawed.. :-)

The results show Nginx/WSGI performing 3x as fast as Lighttpd/FastCGI,
at over 1000 requests per second!

I tested both with Keep-Alives on and off. I'm not sure why Nginx/WSGI
performed 2x as fast with keep-alives on.
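
Taking the requests-per-second figures from the full results below, the
ratios behind those claims work out roughly as follows (simple arithmetic,
shown here only as a quick check):

# Requests/sec from the four runs reported below.
nginx_ka,  nginx_nka  = 1030.93, 420.52   # Nginx 0.5.34, keep-alive on / off
lighty_ka, lighty_nka = 344.71,  248.94   # Lighttpd 1.4.13, keep-alive on / off

print 'Nginx vs Lighttpd, keep-alive on:  %.2fx' % (nginx_ka / lighty_ka)    # ~2.99x
print 'Nginx vs Lighttpd, keep-alive off: %.2fx' % (nginx_nka / lighty_nka)  # ~1.69x
print 'Nginx keep-alive vs none:          %.2fx' % (nginx_ka / nginx_nka)    # ~2.45x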

*********** Full results below *************

--------------------------------------------
Nginx 0.5.34 - Keepalives On
---------------------------------------------
ab -c 10 -n 1000 -k http://10.211.55.4/wsgi-webpy/david
This is ApacheBench, Version 1.3d <$Revision: 1.73 $> apache-1.3

Server Software:        nginx/0.5.34
Server Hostname:        10.211.55.4
Server Port:            80

Document Path:          /wsgi-webpy/david
Document Length:        14 bytes

Concurrency Level:      10
Time taken for tests:   0.970 seconds
Complete requests:      1000
Failed requests:        0
Broken pipe errors:     0
Keep-Alive requests:    1001
Total transferred:      136136 bytes
HTML transferred:       14014 bytes
** Requests per second:    1030.93 [#/sec] (mean) **
Time per request:       9.70 [ms] (mean)
Time per request:       0.97 [ms] (mean, across all concurrent requests)
Transfer rate:          140.35 [Kbytes/sec] received

Connnection Times (ms)
              min  mean[+/-sd] median   max
Connect:        0     0    0.4      0     5
Processing:     1     9    4.3      9    26
Waiting:        0     9    4.2      9    25
Total:          1     9    4.3      9    26

Percentage of the requests served within a certain time (ms)
  50%      9
  66%     11
  75%     12
  80%     13
  90%     15
  95%     17
  98%     20
  99%     22
 100%     26 (last request)

--------------------------------------------
Nginx 0.5.34 - No Keepalives
---------------------------------------------
ab -c 10 -n 1000  http://10.211.55.4/wsgi-webpy/david
This is ApacheBench, Version 1.3d <$Revision: 1.73 $> apache-1.3

Server Software:        nginx/0.5.34
Server Hostname:        10.211.55.4
Server Port:            80

Document Path:          /wsgi-webpy/david
Document Length:        14 bytes

Concurrency Level:      10
Time taken for tests:   2.378 seconds
Complete requests:      1000
Failed requests:        0
Broken pipe errors:     0
Total transferred:      131131 bytes
HTML transferred:       14014 bytes
** Requests per second:    420.52 [#/sec] (mean) **
Time per request:       23.78 [ms] (mean)
Time per request:       2.38 [ms] (mean, across all concurrent requests)
Transfer rate:          55.14 [Kbytes/sec] received

Connnection Times (ms)
              min  mean[+/-sd] median   max
Connect:        0     4    2.9      3    26
Processing:     8    19    8.8     18   136
Waiting:        0    19    8.8     17   135
Total:          8    23    8.9     21   142

Percentage of the requests served within a certain time (ms)
  50%     21
  66%     24
  75%     26
  80%     28
  90%     34
  95%     40
  98%     45
  99%     47
 100%    142 (last request)

*********************************************************************

--------------------------------------------
Lighttpd 1.4.13 - Keepalives On
---------------------------------------------
ab -c 10 -n 1000 -k http://10.211.55.4/david
This is ApacheBench, Version 1.3d <$Revision: 1.73 $> apache-1.3

Server Software:        lighttpd/1.4.13
Server Hostname:        10.211.55.4
Server Port:            80

Document Path:          /david
Document Length:        14 bytes

Concurrency Level:      10
Time taken for tests:   2.901 seconds
Complete requests:      1000
Failed requests:        1
   (Connect: 0, Length: 1, Exceptions: 0)
Broken pipe errors:     0
Keep-Alive requests:    942
Total transferred:      138711 bytes
HTML transferred:       14001 bytes
** Requests per second:    344.71 [#/sec] (mean) **
Time per request:       29.01 [ms] (mean)
Time per request:       2.90 [ms] (mean, across all concurrent requests)
Transfer rate:          47.81 [Kbytes/sec] received

Connnection Times (ms)
              min  mean[+/-sd] median   max
Connect:        0     0    1.1      0    21
Processing:     3    28   29.3     22   385
Waiting:        3    28   29.3     22   385
Total:          3    28   29.3     22   385

Percentage of the requests served within a certain time (ms)
  50%     22
  66%     26
  75%     31
  80%     34
  90%     48
  95%     60
  98%    100
  99%    164
 100%    385 (last request)

--------------------------------------------
Lighttpd 1.4.13 - No Keepalives
---------------------------------------------
ab -c 10 -n 1000 http://10.211.55.4/david
This is ApacheBench, Version 1.3d <$Revision: 1.73 $> apache-1.3

Server Software:        lighttpd/1.4.13
Server Hostname:        10.211.55.4
Server Port:            80

Document Path:          /david
Document Length:        14 bytes

Concurrency Level:      10
Time taken for tests:   4.017 seconds
Complete requests:      1000
Failed requests:        1
   (Connect: 0, Length: 1, Exceptions: 0)
Broken pipe errors:     0
Total transferred:      134269 bytes
HTML transferred:       14029 bytes
** Requests per second:    248.94 [#/sec] (mean) **
Time per request:       40.17 [ms] (mean)
Time per request:       4.02 [ms] (mean, across all concurrent requests)
Transfer rate:          33.43 [Kbytes/sec] received

Connnection Times (ms)
              min  mean[+/-sd] median   max
Connect:        0     3    4.9      2    68
Processing:     3    36   49.6     28   852
Waiting:        2    35   49.6     28   852
Total:          3    39   50.1     30   855

Percentage of the requests served within a certain time (ms)
  50%     30
  66%     36
  75%     41
  80%     44
  90%     61
  95%     87
  98%    148
  99%    252
 100%    855 (last request)
--~--~---------~--~----~------------~-------~--~----~
You received this message because you are subscribed to the Google
Groups "web.py" group.
To post to this group, send email to webpy at googlegroups.com
To unsubscribe from this group, send email to webpy-unsubscribe at googlegroups.com
For more options, visit this group at http://groups.google.com/group/webpy?hl=en
-~----------~----~----~----~------~----~------~--~---




-- 
'''Time is unimportant, only life important!
Process improvement is about starting to build an organization that cultivates reliable people!
'''http://zoomquiet.org
Blog @ http://blog.zoomquiet.org/pyblosxom/
Wiki @ http://wiki.woodpecker.org.cn/moin/ZoomQuiet
Douban @ http://www.douban.com/people/zoomq/
Haokanbu @ http://zoomq.haokanbu.com/
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Please use OOo to replace MS Office. http://zh.openoffice.org
Please use 7-zip to replace WinRAR/WinZip.  http://7-zip.org
You can get truly free software.

[Imported from the Mailman archive: http://www.zeuux.org/pipermail/zeuux-python]

Monday, January 14, 2008, 13:19

? ? asmasters at gmail.com
Mon Jan 14 13:19:43 HKT 2008

AMAZING .....VERRRRY STRONG


[Imported from the Mailman archive: http://www.zeuux.org/pipermail/zeuux-python]
