PHP Application to Desktop Application

Problem description:

I have developed a PHP application for my company, and I have just discovered that the application must also work offline. The application works like this: some data is loaded from a MySQL database, then you have to fill in some checklists, insert new data into the database and, in the end, generate a JSON file.

The application will be used by many people in our company, so we thought about installing a webserver (Apache) on their computers and making the application run on their machines. The problem is that, if we decide to go this way, we have to:

  • Download all the data from MySQL BEFORE starting the application (while the user still has internet access) and save it into a JSON file
  • Change all the queries in the project so they read from the JSON file instead of the database
  • Rework the many functions which insert data into the database in real time to write to SQLite instead, and later transfer that data to the MySQL database
  • Accept that, this way, the people who use the program would have access to ALL the PHP files and could modify them at any time

We don't have time to build a real desktop Java application, because this app has to be in use starting from January, so there isn't time to develop one.

Have you got any suggestions? Is there something I'm not thinking about, or a technology which could help me? Thank you!

PS. I have considered programs like Nightrain or PHP Desktop, but they only avoid the installation of Apache, nothing more...


Introduction

Since you obviously need a fast solution, I'll give you one, based on the information we have. Warning: this solution is not elegant, and you WILL need to replace it when you get the chance.

Solution

  1. Drop all of your primary and foreign keys.
  2. Replace them with indexed BINARY(16) columns.

Every record will need its pseudo-primary key to be randomly generated with a CSPRNG; BINARY(16) is just a convenient size because it follows the UUID standard. This ensures each new record remains uniquely indexed even though no copy of the database knows what keys the other copies have generated.
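As a rough sketch of steps 1 and 2, assuming PDO is used (the checklist table, its columns, and the connection details are invented placeholders, not the application's real schema):

    <?php
    // Minimal sketch; every name and credential here is hypothetical.
    $pdo = new PDO('mysql:host=localhost;dbname=app', 'app_user', 'secret');
    $pdo->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);

    // Indexed BINARY(16) key columns instead of PRIMARY/FOREIGN KEYs.
    $pdo->exec('
        CREATE TABLE checklist (
            id         BINARY(16) NOT NULL,  -- random pseudo-primary key
            entity_id  BINARY(16) NOT NULL,  -- links versions of one logical record (an assumption, see below)
            payload    TEXT,
            created_at TIMESTAMP  NOT NULL DEFAULT CURRENT_TIMESTAMP,
            INDEX idx_checklist_id (id)      -- an index, deliberately NOT a PRIMARY KEY
        )
    ');

    // 16 random bytes from a CSPRNG (PHP 7+). A brand-new record is its
    // own first version, so entity_id starts out equal to id.
    $id = random_bytes(16);
    $pdo->prepare('INSERT INTO checklist (id, entity_id, payload) VALUES (?, ?, ?)')
        ->execute([$id, $id, '{"example":"data"}']);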

Your tables won't have primary-key indexes: primary keys must be unique, and since the database will be distributed it won't be possible to check key uniqueness anyway, so there is no point in enforcing it.

  3. Each laptop will need a copy of the entire database.
  4. Each laptop will only be allowed to add new data, never delete or modify base data.

In fact, as a rule, all data on the central database will be write-once/read-only from now on. No matter how erroneous newly merged data turns out to be, it must never be deleted or modified. A way of enforcing this at the privilege level is sketched below.
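One way to make that rule mechanical rather than a convention (again a sketch; the account, password, and schema names are all assumptions) is to give the account used for merging only SELECT and INSERT privileges on the central server:

    <?php
    // Sketch: run once by a MySQL administrator on the central server.
    // 'app_merge' and the 'app' schema are hypothetical names.
    $admin = new PDO('mysql:host=central-db;dbname=mysql', 'root', 'admin-secret');
    $admin->exec("CREATE USER 'app_merge'@'%' IDENTIFIED BY 'change-me'");
    // SELECT and INSERT only -- deliberately no UPDATE or DELETE, so
    // merged data can never be altered or removed through this account.
    $admin->exec("GRANT SELECT, INSERT ON app.* TO 'app_merge'@'%'");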

  5. New data should be regarded as an "update" based on its timestamp.

So every table will need a timestamp column.
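With timestamps in place, the "current version" of a record is simply its newest row. One way to read that back (a sketch; the entity_id column linking versions of the same record is my assumption, since nothing here specifies how rows are grouped):

    <?php
    // Sketch: fetch the newest row per logical entity.
    $pdo  = new PDO('mysql:host=central-db;dbname=app', 'app_merge', 'change-me');
    $stmt = $pdo->query('
        SELECT c.*
        FROM checklist c
        JOIN (
            SELECT entity_id, MAX(created_at) AS latest
            FROM checklist
            GROUP BY entity_id
        ) newest
          ON newest.entity_id = c.entity_id
         AND newest.latest    = c.created_at
    ');
    foreach ($stmt->fetchAll(PDO::FETCH_ASSOC) as $row) {
        // $row now holds the current version of one record.
    }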

  6. Finally, keep a record of when each copy was distributed, so you know which data needs to be merged back into the central database.

What you are left with is a central database that takes on all data, where changes to existing data are represented by the presence of newer records; the merge pass that feeds it is sketched below.
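A sketch of that merge pass, assuming the laptops keep their offline inserts in a local SQLite file and reusing the hypothetical names from the sketches above:

    <?php
    // Sketch: copy every row created after this laptop's copy was cut.
    // $distributedAt would come from the distribution record of step 6
    // (hard-coded here for illustration).
    $distributedAt = '2015-12-01 00:00:00';

    $local   = new PDO('sqlite:/path/to/local-copy.db');
    $central = new PDO('mysql:host=central-db;dbname=app', 'app_merge', 'change-me');

    $rows = $local->prepare(
        'SELECT id, entity_id, payload, created_at
           FROM checklist WHERE created_at > ?'
    );
    $rows->execute([$distributedAt]);

    $insert = $central->prepare(
        'INSERT INTO checklist (id, entity_id, payload, created_at)
         VALUES (?, ?, ?, ?)'
    );
    foreach ($rows->fetchAll(PDO::FETCH_ASSOC) as $r) {
        // Insert only: the central database is write-once/read-only.
        $insert->execute([$r['id'], $r['entity_id'], $r['payload'], $r['created_at']]);
    }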

Conclusion

I'd only use this solution if I really had to. In fact, I'd estimate only an 80% chance of it working at all, and at sub-standard quality even then. It also assumes that you can devote all remaining development time to refactoring the data-insertion methods.

You are going to have to deal with the fact that a LOT of administration work will be needed on the central database to maintain the integrity of the data, and you will have to work on the assumption that you can't change the format of the input being merged in from the laptops.

Every new feature will need to be backwards compatible with old data.