Well, the absolute easiest way to run Spark is to do it locally (e.g. you can brew install it on a Mac and just go) or to pay for a proprietary service like Databricks, which makes setting up a cluster take a few clicks.
That said, I think `flintrock launch my-cluster` is almost as easy as doing `pip install ...`.
You do need an AWS account and you do need to set your preferences like region and key name in a config file, but I don't see how you can get out of doing even that without subscribing to some managed service like Databricks that abstracts everything away and replaces it with a nice Web UI.
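For concreteness, that one-time setup is roughly this (a sketch; the exact config keys and file location follow Flintrock's `flintrock configure` template and can vary by version and platform, and the values here are placeholders):

```shell
pip install flintrock
flintrock configure          # opens the YAML config to set region, key name, etc.
flintrock launch my-cluster  # launches the cluster on EC2
flintrock login my-cluster   # SSH into the master
flintrock destroy my-cluster # tear it down when done
```

After `flintrock configure`, the config file just needs your EC2 key pair, identity file, and region filled in; everything else has sensible defaults.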