Running multiple concurrent Python programs accessing the same database table
Is there anything in Python that lets you run multiple concurrent Python programs that may access the same database table, while preventing each program from using the full CPU, so the server keeps some spare capacity?
Best answer
Several issues here:

- Multiple concurrent Python programs: see http://wiki.python.org/moin/Concurrency. For a start, I would try the built-in multiprocessing module (http://docs.python.org/2/library/multiprocessing.html).
- Accessing the same database table: each process should create its own database connection; from there, concurrency is managed by the RDBMS and/or configured through connection and query options. If you really need to synchronize between processes, Locks or Semaphores can help.
- Preventing each program from using the full CPU: it depends on what your processes should do, but I would go with:
  - one main program that runs all the time (the master process), pauses between rounds (time.sleep, gevent.sleep, or similar), and spawns and controls the worker processes;
  - spawned processes that do the job (the workers): each opens a new connection, does its database actions, and quits.

I'm sure some of the workflows provided by the multiprocessing (or other) modules fit your needs (workers, pools, queues, pipes, shared state, synchronization, ...).