How do I solve a labor scheduling problem based on different constraints? - resource-scheduling

Schedule laborers for a given task based on different constraints.
Example constraints: labor skills, time, locations, shifts, holidays, priorities, capacity, etc.
Problem description:
Let us say there is a task to set up a Linux server in California on 25th Dec, and the setup takes five hours. To do it, we need a laborer who is skilled with Linux servers, is based in California, and is free on 25th Dec from 9 am to 3 pm (a six-hour window).
So if there is a pool of labor resources and I have to find suitable laborers for multiple tasks, what approach should I follow?
I googled and found that this is a constraint programming problem, and that Google OR-Tools and other solvers provide a way to solve it.
So I started looking into Google OR-Tools (see the Google OR-Tools doc).
The documentation provides basic examples.
I found another doc on GitHub for Google OR-Tools which is better than the above link:
GitHub doc for Google OR-Tools
I tried implementing the nurse scheduling program given here: Google OR nurse scheduling.
I am having difficulty understanding the program, and it is not that I have trouble with Python or Java.
So my questions are:
What are the ways of solving such a problem?
Is Google OR-Tools the right tool? If yes, what are the prerequisites for somebody with a weak maths background?
How should I proceed to solve such a problem?
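For a concrete sense of what a constraint model for this looks like, here is a minimal sketch using the OR-Tools CP-SAT solver in Python. All the data (worker names, skills, locations, free hours) is hypothetical, and a real model would add shifts, holidays, priorities, and capacity as further constraints:

from ortools.sat.python import cp_model

# Hypothetical pool of workers and one task, mirroring the example above.
workers = ["alice", "bob", "carol"]
skills = {"alice": {"linux"}, "bob": {"windows"}, "carol": {"linux"}}
location = {"alice": "CA", "bob": "CA", "carol": "NY"}
hours_free = {"alice": 6, "bob": 8, "carol": 6}  # free hours on 25th Dec
# task -> (required skill, required location, hours needed)
tasks = {"setup_linux_server": ("linux", "CA", 5)}

model = cp_model.CpModel()
assign = {}  # assign[w, t] == 1 if worker w is scheduled for task t
for w in workers:
    for t in tasks:
        assign[w, t] = model.NewBoolVar(f"assign_{w}_{t}")

for t, (skill, loc, hours) in tasks.items():
    for w in workers:
        # Hard constraints: rule out workers who lack the skill, are in
        # the wrong location, or don't have enough free hours.
        if skill not in skills[w] or location[w] != loc or hours_free[w] < hours:
            model.Add(assign[w, t] == 0)
    # Every task must be covered by exactly one worker.
    model.Add(sum(assign[w, t] for w in workers) == 1)

solver = cp_model.CpSolver()
status = solver.Solve(model)
if status in (cp_model.OPTIMAL, cp_model.FEASIBLE):
    for (w, t), var in assign.items():
        if solver.Value(var) == 1:
            print(f"{t} -> {w}")

The pattern is always the same: Boolean assignment variables, hard constraints that force infeasible assignments to zero, and a coverage requirement per task. With multiple tasks and workers you would typically also add an objective (e.g. minimize cost or maximize priority coverage) rather than just searching for feasibility.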

Related

Which is easier to learn: Google Cloud or AWS, with the main use case being Redshift/BigQuery + small data pipelines?

Looking at building a small data warehouse, so we don't need much scaling, just flexibility, ease of building ETL pipelines, and minimal maintenance.
Currently we are looking at a relatively simple architecture with Google App Engine handling the ETL into BigQuery, but I'm wondering if Redshift + EC2 would be easier to learn? We haven't worked with either cloud as a team.
This is not a good question for Stack Overflow as it can only be answered with opinions.
However, I will try to help, as I know the four major clouds very well (AWS, Alibaba, Azure, Google): 12 years of experience with AWS, 10 with Azure, 8 with Google (on and off), and 1 with Alibaba.
Which cloud is easier to learn?
How much experience in IT do you have? If you have a lot, which technologies (Linux / Open Source or Microsoft enterprise)?
Each of the clouds is excellent. Each one has very good services. Nobody is so far ahead of the others that it really makes a difference in terms of technology.
AWS is considered the market leader and, at this time, tends to move faster than everyone else. This also means that you will spend time every single day reading AWS blogs, product announcements, feature updates, etc. to stay current.
For your use case, any of the cloud vendors will serve you well. It is only when you have large infrastructures or very specific technical needs, that maybe one cloud is better. It really comes down to which one you prefer (or like), which one has services that you are more familiar with, which one has the support policies that you need, etc.
Another item. There is a huge race between these vendors to be the best. For any given service - A might be better today (or easier to use or have better tools) but tomorrow B is now ahead. Wait two quarters and C is now ahead. I expect that this will continue for another two years or so until everyone matures with their technology, services and support.
The days of choosing only one cloud vendor for everything are over. Today, I typically design three-way hybrid environments (two cloud vendors and on-site data centers).

Combinatorial Optimization problems on Docker or AWS

I am attacking a combinatorial optimization problem similar to the multi-knapsack problem. The problem has an optimal solution, and I prefer not to settle for an approximate solution.
Are there any recommended tutorials regarding the quick prototyping and deployment of combinatorial optimization solutions (for senior software engineers that are also Big Data newbies)? I want to move quickly from prototype to deployment onto a docker cluster or AWS.
My background is in distributed systems (a focus on .NET, Java, Kafka, Docker containers, etc.), thus I'm typically inclined to solve complex problems by parallel processing across a cluster of machines (via scaling on a Docker cluster or AWS). However, this particular problem can NOT be solved in a brute-force manner, as the problem space is too large (roughly 100^1000 combinations are possible).
I've limited experience with “big data”, but I'm studying up on knapsack solvers, genetic algorithms, reinforcement learning, and some other AI/ML approaches. Given my limited exposure in this area, how would one recommend I tackle a problem such as this?
I tend to favor leveraging existing frameworks/libraries as much as possible. Good idea? Or would one recommend using Accord.NET or ML.NET or some other library to build a custom model?
If existing frameworks are the way to go, any particular favorites? TensorFlow? Any thoughts on Google OR-Tools: https://developers.google.com/optimization/ Anything in the AWS space?
Any good tutorials, videos, or podcasts that can get me prototyping quickly? (Keeping in mind my goal of deploying and validating the model on a Docker cluster.)
Thank you for any help and guidance!
The Cloud Balancing problem in OptaPlanner (open source, Java) is a multi-knapsack problem. There's a tutorial for it in the user guide. Many users run OptaPlanner implementations on Docker (a normal OpenJDK 8 image) and AWS. Here's an Employee Rostering implementation that is deployed to OpenShift Dedicated (which generates a Docker image that it runs on AWS); it exposes a REST API (which is even Swagger-documented).
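Since Google OR-Tools came up in the question, here is a minimal multiple-knapsack sketch using its CP-SAT solver in Python. The item weights, values, and capacities below are made up purely for illustration:

from ortools.sat.python import cp_model

# Hypothetical data: 10 items to pack into 2 knapsacks.
weights = [48, 30, 42, 36, 36, 48, 42, 42, 36, 24]
values = [10, 30, 25, 50, 35, 30, 15, 40, 30, 35]
capacities = [100, 100]
items, bins = range(len(weights)), range(len(capacities))

model = cp_model.CpModel()
# x[i, b] == 1 if item i is packed into knapsack b.
x = {(i, b): model.NewBoolVar(f"x_{i}_{b}") for i in items for b in bins}

# Each item goes into at most one knapsack.
for i in items:
    model.Add(sum(x[i, b] for b in bins) <= 1)
# Respect each knapsack's capacity.
for b in bins:
    model.Add(sum(weights[i] * x[i, b] for i in items) <= capacities[b])

# Maximize the total value of packed items.
model.Maximize(sum(values[i] * x[i, b] for i in items for b in bins))

solver = cp_model.CpSolver()
status = solver.Solve(model)
if status == cp_model.OPTIMAL:
    print("total value:", solver.ObjectiveValue())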
Thanks to all for your insight above. I'm having a look at OptaPlanner and Google OR-Tools, as well as a few other solvers.
To follow up on this question: if I were to relax the constraint that I want the optimal answer and allow for "approximate" solutions, would this change your guidance or recommended tool set (libraries/frameworks) in any way?
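One way this relaxation commonly plays out in practice: exact solvers such as CP-SAT accept a time budget and return the best solution found so far, so you get an approximate, anytime mode from the same model. A sketch, reusing `model` from the multi-knapsack example above:

# Trade optimality for speed: stop after a fixed budget and accept
# the best solution found so far.
solver = cp_model.CpSolver()
solver.parameters.max_time_in_seconds = 30.0
status = solver.Solve(model)
# FEASIBLE here means "best solution found within the budget",
# not a proven optimum.
print(solver.StatusName(status), solver.ObjectiveValue())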

Systems architecture for small business ISP

I'm the only programmer at a pretty small ISP in a rural area with around 2000 customers. I have finished a couple of semesters at university, but I only have a couple of years of experience in the field, so I'm uncertain about the architectural decisions I'm making and was hoping somebody could help me pick the right path.
Most of our internal apps were created 8-10 years ago and are severely outdated and I have been given the job to replace those systems. Most of the basic underlying systems are solid but the apps that we use to manage our customers and connecting those to our internal systems are...lacking to say the least.
Most of these applications were created in PHP back in the day and use MySQL databases. I decided I was going to create a couple of REST APIs using Node.js on top of these databases, and then create a central app that takes care of connecting all those systems together and making sure they stay up to date with one another.
Now for the question. I've been looking a bit into enterprise architecture, and from what I've gathered, going with this sort of microservice architecture seems to be a solid plan. However, I've also seen a couple of articles talking about message buses, and my question is whether I should instead set up a message bus, for example Apache ActiveMQ, so these services can talk amongst themselves instead of using a central app that manages all of them.
Are there any specific patterns I should be reading up on, or does what I've come up with look solid enough?
An enterprise service bus will add a lot of complexity to your design, so you need to weigh the pros and cons to see if it's really necessary. Here is an article on the topic. You can always upgrade your architecture in the future and migrate the services.
I run some complex services on Apache Tomcat and they work great, supporting a user pool of 70,000. If you build in connection pooling and redundancy you should be fine.
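For a concrete feel of the message-bus option, here is a minimal Python sketch that publishes an event to ActiveMQ over STOMP using the stomp.py library. The broker address, credentials, and queue name are all hypothetical; the same idea applies from Node.js with a STOMP or AMQP client:

import json
import stomp  # pip install stomp.py

# Connect to a local ActiveMQ broker on its default STOMP port.
conn = stomp.Connection(host_and_ports=[("localhost", 61613)])
conn.connect("admin", "admin", wait=True)

# Publish a "customer updated" event; any interested service can
# subscribe to this queue instead of being called by a central app.
event = {"customer_id": 42, "change": "address_updated"}
conn.send(destination="/queue/customers.updated", body=json.dumps(event))
conn.disconnect()

The decoupling benefit is that publishers don't need to know which services consume the event, at the cost of running and monitoring a broker.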

As an experiment I want to work a bit with AWS. How much might I expect to pay?

I'm about to go to PyCon, and while I have my hosting at WebFaction, one of the tutorials (JKM) asks students to have AWS instances. I've been trying to figure out what some minimum-charge examples might look like. I'll have a LAMP server with Django and a requisite amount of storage, but next to no traffic.
Anyone have some guidance/advice? My Google searches and a look here did not turn up much useful info.
It depends on how long you need to run your instance. A small Linux instance costs 8.5 cents per hour. If you spend a week at PyCon and have your instance running the entire week, it would cost $14.28. You probably won't need it while you are asleep, so you can turn it off when you are done each day. If you only need it for an hour, it will cost you 8.5 cents.
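A quick back-of-the-envelope check of those figures in Python, using the $0.085/hour small-instance price quoted above (prices change, so verify against the pricing page below):

rate = 0.085  # USD per hour for a small Linux instance (quoted above)

print(round(rate * 24 * 7, 2))  # running 24/7 for a week -> 14.28
print(round(rate * 8 * 5, 2))   # 8 hours/day, 5 days     -> 3.4
print(rate * 1)                 # a single hour           -> 0.085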
Here are more details on pricing if you need a bigger server or a Windows server instead:
http://aws.amazon.com/ec2/#pricing
The AWS calculator might also help with estimating cost.
See http://calculator.s3.amazonaws.com/calc5.html
Also try here for a comparison of various on-demand services (plus rough calculations of how much it would cost to roll it yourself): https://secure.slicify.com/Calculator.aspx
(full disclosure - it's a page on my site).

Statistics based marketing campaign measurement tools

Currently using SAS as the measurement engine and BusinessObjects as the display layer. Looking to develop a new, faster, slicker solution. Has anyone developed or purchased a campaign measurement reporting system? This solution should measure everything: email stats, web stats, customer activity, lift, ROI, etc.
OK, I'm researching and finding nada. We are working with a team from India and they want to rewrite everything from scratch. Any solutions out there at all?
If you are already using SAS, have you looked at their Marketing Automation software?
Update:
Just saw a press release from SAS about a new "Software as a Service" Campaign Management solution. Might be worth checking out for this.
When I was a consultant, we either rolled our own or used SAS (or a combination of the two).
Another vote for rolling your own; it's mad that this area is so underserved. That said, the expense of building your own solution from the ground up and the hassle of managing a remote team make me think you may get further by integrating some existing tools.
Google Analytics has an API for web usage, and there are many web log tools; you then need to bolt on the customer figures from your end of things.
I really doubt you could do much better than SAS in this area, especially if you pick up some of their specialist packages.
You could have a look at R, which is a pretty slick open-source statistics package. Unfortunately it's not used very much for marketing; most of the examples and freely available code are geared towards biochemistry, genetics, etc.