Commit 8746a85

Merge pull request #91 from astrolabsoftware/0.2.1
Bump version 0.2.0 -> 0.2.1
JulienPeloton authored Aug 10, 2018
2 parents f2df03e + 3692215 commit 8746a85
Showing 22 changed files with 101 additions and 67 deletions.
5 changes: 5 additions & 0 deletions CHANGELOG.md
@@ -1,3 +1,8 @@
+## 0.2.1
+
+- pyspark3d contains all the features of spark3D ([partitioning](https://github.com/astrolabsoftware/spark3D/pull/89), [operators](https://github.com/astrolabsoftware/spark3D/pull/90))
+- Scala reflection support for python using py4j ([PR](https://github.com/astrolabsoftware/spark3D/pull/90))
+
## 0.2.0

- Add support for Python: pyspark3d ([PR](https://github.com/astrolabsoftware/spark3D/pull/86)).
5 changes: 2 additions & 3 deletions README.md
@@ -10,10 +10,9 @@
- [06/2018] **Release**: version 0.1.0, 0.1.1
- [07/2018] **New location**: spark3D is an official project of [AstroLab Software](https://astrolabsoftware.github.io/)!
- [07/2018] **Release**: version 0.1.3, 0.1.4, 0.1.5
-- [08/2018] **Release**: version 0.2.0 (pyspark3d)
+- [08/2018] **Release**: version 0.2.0, 0.2.1 (pyspark3d)

-<p align="center"><img width="400" src="https://github.com/astrolabsoftware/spark3D/raw/master/pic/spark3d_lib_0.2.0.png"/>
-<img width="400" src="https://github.com/astrolabsoftware/spark3D/raw/master/pic/pyspark3d_lib_0.2.0.png"/>
+<p align="center"><img width="500" src="https://github.com/astrolabsoftware/spark3D/raw/master/pic/spark3d_lib_0.2.1.png"/>
</p>

## Installation and tutorials
2 changes: 1 addition & 1 deletion build.sbt
@@ -19,7 +19,7 @@
import xerial.sbt.Sonatype._
lazy val root = (project in file(".")).
settings(
inThisBuild(List(
-version := "0.2.0"
+version := "0.2.1"
)),
// Name of the application
name := "spark3D",
45 changes: 31 additions & 14 deletions docs/01_installation.md
@@ -20,7 +20,7 @@
another version, feel free to contact us. In addition to Spark, the library has
You can link spark3D to your project (either `spark-shell` or `spark-submit`) by specifying the coordinates:

```bash
-toto:~$ spark-submit --packages "com.github.astrolabsoftware:spark3d_2.11:0.2.0" <...>
+toto:~$ spark-submit --packages "com.github.astrolabsoftware:spark3d_2.11:0.2.1" <...>
```

It might not contain the latest features though (see *Building from source*).
@@ -60,7 +60,7 @@
To launch the test suite, just execute:

```bash
toto:~$ sbt ++${SCALA_VERSION} coverage test coverageReport
```

-We also provide a script (test.sh) that you can execute. You should get the
+We also provide a script (test_scala.sh) that you can execute. You should get the
result on the screen, plus details of the coverage at
`target/scala_${SCALA_VERSION}/scoverage-report/index.html`.

@@ -69,7 +69,7 @@
First produce a jar of the spark3D library, and then launch a spark-shell by specifying the external dependencies:

```bash
-toto:~$ JARS="target/scala-2.11/spark3d_2.11-0.2.0.jar,lib/jhealpix.jar"
+toto:~$ JARS="target/scala-2.11/spark3d_2.11-0.2.1.jar,lib/jhealpix.jar"
toto:~$ PACKAGES="com.github.astrolabsoftware:spark-fits_2.11:0.6.0"
toto:~$ spark-shell --jars $JARS --packages $PACKAGES
```
@@ -80,10 +80,10 @@
You will be able to import anything from spark3D

```scala
scala> import com.astrolabsoftware.spark3d.geometryObjects.Point3D
scala> // etc...
```
-Note that if you make a fat jar (that is building with `sbt assembly` and not `sbt package`), you do not need to specify external dependencies as they are already included in the resulting jar:
+Note that if you make a fat jar (that is building with `sbt ++${SCALA_VERSION} assembly` and not `sbt ++${SCALA_VERSION} package`), you do not need to specify external dependencies as they are already included in the resulting jar:

```bash
-toto:~$ FATJARS="target/scala-2.11/spark3D-assembly-0.2.0.jar"
+toto:~$ FATJARS="target/scala-2.11/spark3D-assembly-0.2.1.jar"
toto:~$ spark-shell --jars $FATJARS
```

@@ -102,7 +102,7 @@
objects in the Java Virtual Machine, and then Python programs running in a Python

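py4j's role can be pictured with a minimal, self-contained sketch (illustration only, not pyspark3d code; it assumes a separate JVM process is already running py4j's `GatewayServer` on the default port):

```python
# Minimal py4j sketch (illustration, not pyspark3d source code).
# Assumes a JVM is already running py4j's GatewayServer on the
# default port (25333).
from py4j.java_gateway import JavaGateway

gateway = JavaGateway()          # connect to the running JVM
jvm = gateway.jvm                # entry point to the Java side
random = jvm.java.util.Random()  # instantiate a Java object from Python
print(random.nextInt(10))        # method calls are forwarded to the JVM
```

pyspark3d follows the same pattern through the gateway that pyspark already opens, which is how Scala classes such as `Point3D` become callable from Python.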
## Requirements

-pyspark is tested on Python 3.5 and later.
+pyspark3d is tested on Python 3.5 and later.
**Note: pyspark3d will not run for python 3.4 and earlier (incl. 2.X).** The reason is
that we make use of [type hints](https://www.python.org/dev/peps/pep-0484/):
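The snippet at this point is collapsed in the diff view; as a purely illustrative sketch of PEP 484 annotations (hypothetical function, not the actual pyspark3d source):

```python
# Hypothetical example (not the actual pyspark3d source): annotations
# like these are a syntax error on Python < 3.5, hence the 3.5+ requirement.
def norm(x: float, y: float, z: float) -> float:
    """Euclidean norm of a 3D vector."""
    return (x * x + y * y + z * z) ** 0.5
```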
@@ -146,7 +146,7 @@
Edit the `pyspark3d_conf.py` with the newly created JAR:
version = __version__

# Scala version used to compile spark3D
-scala_version = "2.11"
+scala_version = __scala_version__

...

@@ -197,7 +197,7 @@
First produce a FAT JAR of the spark3D library (see above), and then launch a shell:

```bash
toto:~$ PYSPARK_DRIVER_PYTHON=ipython pyspark \
-  --jars /path/to/target/scala-2.11/spark3D-assembly-0.2.0.jar \
+  --jars /path/to/target/scala-2.11/spark3D-assembly-0.2.1.jar \
--packages com.github.astrolabsoftware:spark-fits_2.11:0.6.0
```

@@ -209,7 +209,7 @@
```python
In [2]: from pyspark3d.geometryObjects import Point3D
In [3]: Point3D?
Signature: Point3D(x:float, y:float, z:float, isSpherical:bool) -> py4j.java_gateway.JavaObject
Docstring:
-Binding arount Point3D.scala. For full description,
+Binding around Point3D.scala. For full description,
see `$spark3d/src/main/scala/com/spark3d/geometryObjects/Point3D.scala`.

By default, the input coordinates are supposed euclidean,
@@ -247,16 +247,33 @@
You can then call the method associated, for example
>>> p3d.getVolume()
0.0

-Convert the (theta, phi) in Healpix pixel index:
+Return the point coordinates
+>>> p3d = Point3D(1.0, 1.0, 0.0, False)
+>>> p3d.getCoordinatePython()
+[1.0, 1.0, 0.0]
+
+It will be a JavaList by default
+>>> coord = p3d.getCoordinatePython()
+>>> print(type(coord))
+<class 'py4j.java_collections.JavaList'>
+
+Make it a python list
+>>> coord_python = list(coord)
+>>> print(type(coord_python))
+<class 'list'>
+
+[Astro] Convert the (theta, phi) in Healpix pixel index:
>>> p3d = Point3D(1.0, np.pi, 0.0, True) # (z, theta, phi)
>>> p3d.toHealpix(2048, True)
50331644

To see all the available methods:
>>> print(sorted(p3d.__dir__())) # doctest: +NORMALIZE_WHITESPACE
['center', 'distanceTo', 'equals', 'getClass', 'getCoordinate',
-'getEnvelope', 'getHash', 'getVolume', 'hasCenterCloseTo', 'hashCode',
-'intersects', 'isEqual', 'isSpherical', 'notify', 'notifyAll', 'toHealpix',
-'toHealpix$default$2', 'toString', 'wait', 'x', 'y', 'z']
+'getCoordinatePython', 'getEnvelope', 'getHash', 'getVolume',
+'hasCenterCloseTo', 'hashCode', 'intersects', 'isEqual', 'isSpherical',
+'notify', 'notifyAll', 'toHealpix', 'toHealpix$default$2', 'toString',
+'wait', 'x', 'y', 'z']
File: ~/Documents/workspace/myrepos/spark3D/pyspark3d/geometryObjects.py
Type: function
```
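As a cross-check of the Healpix doctest above, the returned index is consistent with nside=2048: a HEALPix map has 12·nside² = 50,331,648 pixels, and θ=π lands in the last (south-pole) ring. A sketch, assuming the RING indexing scheme and the third-party healpy package (not a spark3D dependency):

```python
import numpy as np
import healpy as hp  # assumption: third-party package, not a spark3D dependency

nside = 2048
print(12 * nside ** 2)                # 50331648 pixels in total
print(hp.ang2pix(nside, np.pi, 0.0))  # 50331644: first pixel of the south-pole ring
```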
@@ -268,5 +268,5 @@
You can follow the different tutorials:
- Space partitioning [Scala]({{ site.baseurl }}{% link 03_partitioning_scala.md %}), [Python]({{ site.baseurl }}{% link 03_partitioning_python.md %})
- Query [Scala]({{ site.baseurl }}{% link 04_query_scala.md %}), [Python]({{ site.baseurl }}{% link 04_query_python.md %})

-We also include [Scala examples](https://github.com/astrolabsoftware/spark3D/tree/master/src/main/scala/com/spark3d/examples) and runners (`run_*.sh`) in the root folder of the repo.
+We also include [Scala examples](https://github.com/astrolabsoftware/spark3D/tree/master/src/main/scala/com/spark3d/examples) and runners (`run_*.sh`) in the `runners` folder of the repo.
You might have to modify those scripts to match your environment.
6 changes: 0 additions & 6 deletions docs/_data/navigation.yml
@@ -8,9 +8,3 @@
main:
url: https://astrolabsoftware.github.io/
- title: "Fork me!"
url: https://github.com/astrolabsoftware/spark3D
-# - title: "Sample Posts"
-#   url: /year-archive/
-# - title: "Sample Collections"
-#   url: /collection-archive/
-# - title: "Sitemap"
-#   url: /sitemap/
2 changes: 1 addition & 1 deletion docs/_pages/home.md
@@ -8,7 +8,7 @@
header:
cta_url: "/docs/installation/"
caption:
intro:
-- excerpt: '<p><font size="6">Spark extension for processing large-scale 3D data sets: Astrophysics, High Energy Physics, Meteorology, ...</font></p><br /><a href="https://github.com/astrolabsoftware/spark3D/releases/tag/0.2.0">Latest release v0.2.0</a>'
+- excerpt: '<p><font size="6">Spark extension for processing large-scale 3D data sets: Astrophysics, High Energy Physics, Meteorology, ...</font></p><br /><a href="https://github.com/astrolabsoftware/spark3D/releases/tag/0.2.1">Latest release v0.2.1</a>'
excerpt: '{::nomarkdown}<iframe style="display: inline-block;" src="https://ghbtns.com/github-btn.html?user=astrolabsoftware&repo=spark3D&type=star&count=true&size=large" frameborder="0" scrolling="0" width="160px" height="30px"></iframe> <iframe style="display: inline-block;" src="https://ghbtns.com/github-btn.html?user=astrolabsoftware&repo=spark3D&type=fork&count=true&size=large" frameborder="0" scrolling="0" width="158px" height="30px"></iframe>{:/nomarkdown}'
feature_row:
- image_path:
4 changes: 2 additions & 2 deletions docs/about.md
@@ -39,12 +39,12 @@
Several goals have to be undertaken in this project:
+ ways to define a metric (i.e. distance between objects)
+ selection capability of objects or objects within a region
- work with as many input file formats as possible (CSV, ROOT, FITS, HDF5 and so on)
- Expose several APIs: Scala and Python at least!
- package the developments into an open-source library.

## Current structure

-<p align="center"><img width="400" src="https://github.com/astrolabsoftware/spark3D/raw/master/pic/spark3d_lib_0.2.0.png"/>
-<img width="400" src="https://github.com/astrolabsoftware/spark3D/raw/master/pic/pyspark3d_lib_0.2.0.png"/>
+<p align="center"><img width="500" src="https://github.com/astrolabsoftware/spark3D/raw/master/pic/spark3d_lib_0.2.1.png"/>
</p>

## Support
Binary file modified pic/spark3d_lib_0.2.1.png
4 changes: 2 additions & 2 deletions pyspark3d/pyspark3d_conf.py
@@ -13,7 +13,7 @@
# limitations under the License.
import os
from pathlib import Path
-from version import __version__
+from version import __version__, __scala_version__

# For local tests
path_to_conf = Path().cwd().as_uri()
@@ -25,7 +25,7 @@
version = __version__

# Scala version used to compile spark3D
-scala_version = "2.11"
+scala_version = __scala_version__

# Verbosity for Spark
log_level = "WARN"
2 changes: 1 addition & 1 deletion pyspark3d/version.py
@@ -1,4 +1,4 @@
# pyspark3d and spark3D have the same version number
-__version__ = "0.2.0"
+__version__ = "0.2.1"
__scala_version__ = "2.11"
__scala_version_all__ = "2.11.8"
2 changes: 1 addition & 1 deletion runners/benchmark_knn.sh
@@ -18,7 +18,7 @@
SBT_VERSION=2.11.8
SBT_VERSION_SPARK=2.11

## Package version
-VERSION=0.2.0
+VERSION=0.2.1

# Package it
sbt ++${SBT_VERSION} package
2 changes: 1 addition & 1 deletion runners/run_knn.sh
@@ -18,7 +18,7 @@
SBT_VERSION=2.11.8
SBT_VERSION_SPARK=2.11

## Package version
-VERSION=0.2.0
+VERSION=0.2.1

# Package it
sbt ++${SBT_VERSION} package
2 changes: 1 addition & 1 deletion runners/run_knnGeo.sh
@@ -18,7 +18,7 @@
SBT_VERSION=2.11.8
SBT_VERSION_SPARK=2.11

## Package version
-VERSION=0.2.0
+VERSION=0.2.1

# Package it
sbt ++${SBT_VERSION} package
2 changes: 1 addition & 1 deletion runners/run_part.sh
@@ -18,7 +18,7 @@
SBT_VERSION=2.11.8
SBT_VERSION_SPARK=2.11

## Package version
-VERSION=0.2.0
+VERSION=0.2.1

# Package it
sbt ++${SBT_VERSION} package
2 changes: 1 addition & 1 deletion runners/run_scala.sh
@@ -18,7 +18,7 @@
SBT_VERSION=2.11.8
SBT_VERSION_SPARK=2.11

## Package version
-VERSION=0.2.0
+VERSION=0.2.1

# Package it
sbt ++${SBT_VERSION} package
2 changes: 1 addition & 1 deletion runners/run_shuffle.sh
@@ -18,7 +18,7 @@
SBT_VERSION=2.11.8
SBT_VERSION_SPARK=2.11

## Package version
-VERSION=0.2.0
+VERSION=0.2.1

# Package it
sbt ++${SBT_VERSION} package
2 changes: 1 addition & 1 deletion runners/run_xmatch.sh
@@ -18,7 +18,7 @@
SBT_VERSION=2.11.8
SBT_VERSION_SPARK=2.11

## Package version
-VERSION=0.2.0
+VERSION=0.2.1

# Package it
sbt ++${SBT_VERSION} package
2 changes: 1 addition & 1 deletion runners/run_xmatch_cluster.sh
@@ -18,7 +18,7 @@
SBT_VERSION=2.11.8
SBT_VERSION_SPARK=2.11

## Package version
-VERSION=0.2.0
+VERSION=0.2.1

# Package it
sbt ++${SBT_VERSION} package
2 changes: 1 addition & 1 deletion runners/run_xmatch_geo.sh
@@ -18,7 +18,7 @@
SBT_VERSION=2.11.8
SBT_VERSION_SPARK=2.11

## Package version
-VERSION=0.2.0
+VERSION=0.2.1

# Package it
sbt ++${SBT_VERSION} package
38 changes: 27 additions & 11 deletions setup.py
@@ -1,4 +1,17 @@
#!/usr/bin/env python
+# Copyright 2018 Julien Peloton
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+#     http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
# -*- coding: utf-8 -*-

import os
@@ -30,24 +30,25 @@


class jar_build(build):
+""" Class to handle spark3D JAR while installing pyspark3d """
def run(self):
"""
-Compile the companion jar file.
+Override distutils.command.build.
+Compile the companion library and produce a FAT jar.
"""

if find_executable('sbt') is None:
raise EnvironmentError("""
-The executable "sbt" cannot be found.
-Please install the "sbt" tool to build the companion jar file.
-""")
+The executable "sbt" cannot be found.
+Please install the "sbt" tool to build the companion jar file.
+""")

build.run(self)
subprocess.check_call(
"sbt ++{} assembly".format(SCALA_VERSION_ALL), shell=True)


class jar_clean(clean):
+""" Extends distutils.command.clean """
def run(self):
"""
Cleans the scala targets from the system.
@@ -57,7 +71,13 @@


class my_sdist(sdist):
+""" Extends distutils.command.sdist """
def initialize_options(self, *args, **kwargs):
+"""
+During installation, open the MANIFEST file
+and insert the path to the spark3D JAR required
+to run pyspark3d.
+"""
here = os.path.dirname(os.path.abspath(__file__))
filename = os.path.join(here, "MANIFEST.in")
with open(filename, 'w') as f:
@@ -82,7 +102,6 @@
classifiers=[
'Development Status :: 2 - Pre-Alpha',
'Intended Audience :: Developers',
-'Natural Language :: English',
'Programming Language :: Python :: 3',
'Programming Language :: Python :: 3.6',
],
@@ -100,8 +119,5 @@
'src'
]
},
-# data_files=[
-#   ('share/py4jdbc', [ASSEMBLY_JAR])
-# ],
setup_requires=setup_requirements
)
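For context, custom commands such as `jar_build`, `jar_clean` and `my_sdist` only take effect once registered with `setup()`. The actual registration lies outside the displayed hunks; a hedged sketch of the usual pattern:

```python
# Hedged sketch of the usual cmdclass registration (the real mapping in
# setup.py is not shown in this diff). The class names refer to the
# classes defined above.
from setuptools import setup

setup(
    name="pyspark3d",
    cmdclass={
        "build": jar_build,  # runs `sbt assembly` alongside the Python build
        "clean": jar_clean,  # also removes the Scala build targets
        "sdist": my_sdist,   # records the JAR path in MANIFEST.in first
    },
)
```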