Merge pull request #70 from astrolabsoftware/0.1.4
Bump version to v0.1.4
JulienPeloton authored Jul 13, 2018
2 parents 02c8ef6 + c2d249b commit cd1e68c
Showing 12 changed files with 40 additions and 20 deletions.
5 changes: 5 additions & 0 deletions CHANGELOG.md
@@ -1,3 +1,8 @@
## 0.1.4

- Unify the IO: One constructor to rule them all! ([PR](https://github.com/astrolabsoftware/spark3D/pull/69))
- Octree and Geometry Objects Bug Fixes ([PR](https://github.com/astrolabsoftware/spark3D/pull/67))

## 0.1.3

- Add KNN routines ([KNN](https://github.com/astrolabsoftware/spark3D/pull/59), [KNN](https://github.com/astrolabsoftware/spark3D/pull/60), [KNN](https://github.com/astrolabsoftware/spark3D/pull/62))
2 changes: 1 addition & 1 deletion README.md
@@ -9,7 +9,7 @@
- [05/2018] **GSoC 2018**: spark3D has been selected for the Google Summer of Code (GSoC) 2018. Congratulations to [@mayurdb](https://github.com/mayurdb) who will work on the project this year!
- [06/2018] **Release**: version 0.1.0, 0.1.1
- [07/2018] **New location**: spark3D is an official project of [AstroLab Software](https://astrolabsoftware.github.io/)!
- [07/2018] **Release**: version 0.1.3
- [07/2018] **Release**: version 0.1.3, 0.1.4

## Installation and tutorials

2 changes: 1 addition & 1 deletion build.sbt
@@ -19,7 +19,7 @@ import xerial.sbt.Sonatype._
lazy val root = (project in file(".")).
settings(
inThisBuild(List(
version := "0.1.3"
version := "0.1.4"
)),
// Name of the application
name := "spark3D",
6 changes: 3 additions & 3 deletions docs/01_installation.md
@@ -20,7 +20,7 @@ another version, feel free to contact us. In addition to Spark, the library has
You can link spark3D to your project (either `spark-shell` or `spark-submit`) by specifying the coordinates:

```bash
toto:~$ spark-submit --packages "com.github.astrolabsoftware:spark3d_2.11:0.1.3" <...>
toto:~$ spark-submit --packages "com.github.astrolabsoftware:spark3d_2.11:0.1.4" <...>
```

It might not contain the latest features though (see *Building from source*).
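
Equivalently, if you manage dependencies with sbt, a minimal sketch using the same coordinates (the group, artefact and version are taken from the command above):

```scala
// build.sbt -- sketch only: pull spark3D from Maven Central using the
// coordinates quoted above (Scala 2.11 build, version 0.1.4).
libraryDependencies += "com.github.astrolabsoftware" % "spark3d_2.11" % "0.1.4"
```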
@@ -69,7 +69,7 @@ result on the screen, plus details of the coverage at
First produce a jar of the spark3D library, and then launch a spark-shell by specifying the external dependencies:

```bash
toto:~$ JARS="target/scala-2.11/spark3d_2.11-0.1.3.jar,lib/jhealpix.jar"
toto:~$ JARS="target/scala-2.11/spark3d_2.11-0.1.4.jar,lib/jhealpix.jar"
toto:~$ PACKAGES="com.github.astrolabsoftware:spark-fits_2.11:0.6.0"
toto:~$ spark-shell --jars $JARS --packages $PACKAGES
```
@@ -83,7 +83,7 @@ scala> // etc...
Note that if you make a fat jar (that is, built with `sbt assembly` rather than `sbt package`), you do not need to specify external dependencies, as they are already included in the resulting jar:

```bash
toto:~$ FATJARS="target/scala-2.11/spark3D-assembly-0.1.3.jar"
toto:~$ FATJARS="target/scala-2.11/spark3D-assembly-0.1.4.jar"
toto:~$ spark-shell --jars $FATJARS
```

6 changes: 4 additions & 2 deletions docs/02_introduction.md
@@ -73,7 +73,8 @@ import com.astrolabsoftware.spark3d.spatial3DRDD.Point3DRDD
// We assume filename contains at least 3 columns whose names are `colnames`
// Order of columns in the file does not matter, as they will be re-arranged
// according to `colnames`.
val pointRDD = new Point3DRDD(spark: SparkSession, filename: String, colnames: String, isSpherical: Boolean, format: String, options: Map[String, String])
val pointRDD = new Point3DRDD(spark: SparkSession, filename: String, colnames: String,
isSpherical: Boolean, format: String, options: Map[String, String])
```

`format` and `options` control the correct reading of your data.
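
As an illustration, a minimal sketch of the call with concrete values (the file name, column names and CSV options below are hypothetical; adapt them to your data set):

```scala
// Hypothetical example: points stored as a CSV file with a header and
// three columns named x, y, z (Cartesian coordinates).
val pointRDD = new Point3DRDD(
  spark, "data/points.csv", "x,y,z",
  false, "csv", Map("header" -> "true"))
```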
@@ -134,7 +135,8 @@ import com.astrolabsoftware.spark3d.spatial3DRDD.SphereRDD
// We assume filename contains at least 4 columns whose names are `colnames`.
// Order of columns in the file does not matter, as they will be re-arranged
// according to `colnames`.
val sphereRDD = new SphereRDD(spark: SparkSession, filename: String, colnames: String, isSpherical: Boolean, format: String, options: Map[String, String])
val sphereRDD = new SphereRDD(spark: SparkSession, filename: String, colnames: String,
isSpherical: Boolean, format: String, options: Map[String, String])
```

The resulting RDD is a `RDD[ShellEnvelope]`.
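
Similarly, a minimal sketch for spheres (again with a hypothetical file name and column names; the columns are assumed to hold the centre coordinates and the radius):

```scala
// Hypothetical example: spheres stored as a CSV file with a header and
// four columns named x, y, z (centre) and radius.
val sphereRDD = new SphereRDD(
  spark, "data/spheres.csv", "x,y,z,radius",
  false, "csv", Map("header" -> "true"))
```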
15 changes: 14 additions & 1 deletion docs/04_query.md
@@ -155,7 +155,20 @@ For more details on the cross-match, see the following [notebook](https://github

## Neighbour search

TBD
Brute-force KNN:

```scala
// Load the data
val pRDD = new Point3DRDD(spark, fn, columns, isSpherical, "csv", options)

// Centre object for the query
val queryObject = new Point3D(0.0, 0.0, 0.0, false)

// Find the `nNeighbours` closest neighbours
val knn = SpatialQuery.KNN(queryObject, pRDD.rawRDD, nNeighbours)
```

To come: partitioning + indexing!
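
For reference, a self-contained version of the snippet above. The import paths for `Point3D` and `SpatialQuery`, as well as the input file and column names, are assumptions to be checked against your version of the library:

```scala
import org.apache.spark.sql.SparkSession

import com.astrolabsoftware.spark3d.spatial3DRDD.Point3DRDD
// Assumed import paths -- verify them against the spark3D API you use.
import com.astrolabsoftware.spark3d.geometryObjects.Point3D
import com.astrolabsoftware.spark3d.spatialOperator.SpatialQuery

val spark = SparkSession.builder().appName("knn-example").getOrCreate()

// Hypothetical input: CSV file with a header and x, y, z columns.
val fn = "data/points.csv"
val columns = "x,y,z"
val isSpherical = false
val options = Map("header" -> "true")
val nNeighbours = 10

// Load the data
val pRDD = new Point3DRDD(spark, fn, columns, isSpherical, "csv", options)

// Centre object for the query
val queryObject = new Point3D(0.0, 0.0, 0.0, false)

// Find the `nNeighbours` closest neighbours (brute force)
val knn = SpatialQuery.KNN(queryObject, pRDD.rawRDD, nNeighbours)
```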

## Benchmarks

2 changes: 1 addition & 1 deletion docs/_pages/home.md
@@ -8,7 +8,7 @@ header:
cta_url: "/docs/installation/"
caption:
intro:
- excerpt: '<p><font size="6">Spark extension for processing large-scale 3D data sets: Astrophysics, High Energy Physics, Meteorology, ...</font></p><br /><a href="https://github.com/astrolabsoftware/spark3D/releases/tag/0.1.3">Latest release v0.1.3</a>'
- excerpt: '<p><font size="6">Spark extension for processing large-scale 3D data sets: Astrophysics, High Energy Physics, Meteorology, ...</font></p><br /><a href="https://github.com/astrolabsoftware/spark3D/releases/tag/0.1.4">Latest release v0.1.4</a>'
excerpt: '{::nomarkdown}<iframe style="display: inline-block;" src="https://ghbtns.com/github-btn.html?user=astrolabsoftware&repo=spark3D&type=star&count=true&size=large" frameborder="0" scrolling="0" width="160px" height="30px"></iframe> <iframe style="display: inline-block;" src="https://ghbtns.com/github-btn.html?user=astrolabsoftware&repo=spark3D&type=fork&count=true&size=large" frameborder="0" scrolling="0" width="158px" height="30px"></iframe>{:/nomarkdown}'
feature_row:
- image_path:
2 changes: 1 addition & 1 deletion docs/about.md
@@ -38,7 +38,7 @@ Several goals have to be undertaken in this project:
+ indexing mechanisms
+ ways to define a metric (i.e. distance between objects)
+ selection capability of objects or objects within a region
- work with as many input file formats as possible (CSV, JSON, FITS, and so on)
- work with as many input file formats as possible (CSV, ROOT, FITS, HDF5, and so on)
- package the developments into an open-source library.

## Support
6 changes: 3 additions & 3 deletions examples/jupyter/CrossMatch.ipynb
@@ -75,11 +75,11 @@
"name": "stdout",
"output_type": "stream",
"text": [
"Marking com.github.astrolabsoftware:spark-fits_2.11:0.4.0 for download\n",
"Marking com.github.astrolabsoftware:spark-fits_2.11:0.6.0 for download\n",
"Preparing to fetch from:\n",
"-> file:/var/folders/my/lfvl285927q2hzk545f39sy40000gn/T/toree_add_deps7614799531666607290/\n",
"-> https://repo1.maven.org/maven2\n",
"-> New file at /var/folders/my/lfvl285927q2hzk545f39sy40000gn/T/toree_add_deps7614799531666607290/https/repo1.maven.org/maven2/com/github/astrolabsoftware/spark-fits_2.11/0.4.0/spark-fits_2.11-0.4.0.jar\n",
"-> New file at /var/folders/my/lfvl285927q2hzk545f39sy40000gn/T/toree_add_deps7614799531666607290/https/repo1.maven.org/maven2/com/github/astrolabsoftware/spark-fits_2.11/0.6.0/spark-fits_2.11-0.6.0.jar\n",
"Marking com.github.haifengl:smile-plot:1.5.1 for download\n",
"Preparing to fetch from:\n",
"-> file:/var/folders/my/lfvl285927q2hzk545f39sy40000gn/T/toree_add_deps7614799531666607290/\n",
Expand Down Expand Up @@ -113,7 +113,7 @@
],
"source": [
"// Package to read data from FITS file\n",
"%AddDeps com.github.astrolabsoftware spark-fits_2.11 0.4.0\n",
"%AddDeps com.github.astrolabsoftware spark-fits_2.11 0.6.0\n",
"\n",
"// Smile provides visualisation tools\n",
"%AddDeps com.github.haifengl smile-plot 1.5.1\n",
6 changes: 3 additions & 3 deletions examples/jupyter/onion_partitioning.ipynb
@@ -24,11 +24,11 @@
"name": "stdout",
"output_type": "stream",
"text": [
"Marking com.github.astrolabsoftware:spark-fits_2.11:0.3.0 for download\n",
"Marking com.github.astrolabsoftware:spark-fits_2.11:0.6.0 for download\n",
"Preparing to fetch from:\n",
"-> file:/var/folders/my/lfvl285927q2hzk545f39sy40000gn/T/toree_add_deps3353854346753658887/\n",
"-> https://repo1.maven.org/maven2\n",
"-> New file at /var/folders/my/lfvl285927q2hzk545f39sy40000gn/T/toree_add_deps3353854346753658887/https/repo1.maven.org/maven2/com/github/astrolabsoftware/spark-fits_2.11/0.3.0/spark-fits_2.11-0.3.0.jar\n",
"-> New file at /var/folders/my/lfvl285927q2hzk545f39sy40000gn/T/toree_add_deps3353854346753658887/https/repo1.maven.org/maven2/com/github/astrolabsoftware/spark-fits_2.11/0.6.0/spark-fits_2.11-0.6.0.jar\n",
"Marking com.github.haifengl:smile-plot:1.5.1 for download\n",
"Preparing to fetch from:\n",
"-> file:/var/folders/my/lfvl285927q2hzk545f39sy40000gn/T/toree_add_deps3353854346753658887/\n",
Expand Down Expand Up @@ -62,7 +62,7 @@
],
"source": [
"// Package to read data from FITS file\n",
"%AddDeps com.github.astrolabsoftware spark-fits_2.11 0.3.0\n",
"%AddDeps com.github.astrolabsoftware spark-fits_2.11 0.6.0\n",
"\n",
"// Smile provides visualisation tools\n",
"%AddDeps com.github.haifengl smile-plot 1.5.1\n",
4 changes: 2 additions & 2 deletions run_scala.sh
@@ -18,7 +18,7 @@ SBT_VERSION=2.11.8
SBT_VERSION_SPARK=2.11

## Package version
VERSION=0.1.3
VERSION=0.1.4

# Package it
sbt ++${SBT_VERSION} package
@@ -31,7 +31,7 @@ display="show"

## Dependencies
jars="lib/jhealpix.jar,lib/swingx-0.9.1.jar"
packages="com.github.astrolabsoftware:spark-fits_2.11:0.4.0,com.github.haifengl:smile-core:1.5.1,com.github.haifengl:smile-plot:1.5.1,com.github.haifengl:smile-math:1.5.1,com.github.haifengl:smile-scala_2.11:1.5.1"
packages="com.github.astrolabsoftware:spark-fits_2.11:0.6.0,com.github.haifengl:smile-core:1.5.1,com.github.haifengl:smile-plot:1.5.1,com.github.haifengl:smile-math:1.5.1,com.github.haifengl:smile-scala_2.11:1.5.1"

# Run it!
spark-submit \
4 changes: 2 additions & 2 deletions run_xmatch_cluster.sh
@@ -18,7 +18,7 @@ SBT_VERSION=2.11.8
SBT_VERSION_SPARK=2.11

## Package version
VERSION=0.1.3
VERSION=0.1.4

# Package it
sbt ++${SBT_VERSION} package
@@ -36,7 +36,7 @@ kind="healpix"

## Dependencies
jars="lib/jhealpix.jar"
packages="com.github.astrolabsoftware:spark-fits_2.11:0.4.0"
packages="com.github.astrolabsoftware:spark-fits_2.11:0.6.0"

# Run it!
spark-submit \