Commit 7232e6a

Merge branch 'tensorflow:master' into fix_edges
2 parents 982e4f4 + 643a7d9

File tree: 167 files changed, +18489 / -309 lines changed


README.md

Lines changed: 19 additions & 13 deletions
@@ -71,38 +71,39 @@ learning with structured signals.
     target="_blank"><img src="http://img.youtube.com/vi/Js2WJkhdU7k/0.jpg"
     alt="Adversarial Learning" width="180" border="2" /></a>
 
-We've also created the following hands-on colab-based tutorials that will allow
-you to interactively explore NSL:
+We've also created hands-on colab-based tutorials that will allow you to
+interactively explore NSL. Here are a few:
 
 *   [training with natural graphs](https://github.com/tensorflow/neural-structured-learning/blob/master/g3doc/tutorials/graph_keras_mlp_cora.ipynb)
 *   [training with synthesized graphs](https://github.com/tensorflow/neural-structured-learning/blob/master/g3doc/tutorials/graph_keras_lstm_imdb.ipynb)
 *   [adversarial learning](https://github.com/tensorflow/neural-structured-learning/blob/master/g3doc/tutorials/adversarial_keras_cnn_mnist.ipynb)
 
+You can find more examples and tutorials under the
+[examples](neural_structured_learning/examples) directory.
+
 ## Contributing to NSL
 
 Contributions are welcome and highly appreciated - there are several ways to
 contribute to TF Neural Structured Learning:
 
-*   Case studies. If you are interested in applying NSL, consider wrapping up
+*   Case studies: If you are interested in applying NSL, consider wrapping up
     your usage as a tutorial, a new dataset, or an example model that others
-    could use for experiments and/or development.
+    could use for experiments and/or development. The [examples](examples)
+    directory could be a good destination for such contributions.
 
-*   Product excellence. If you are interested in improving NSL's product
+*   Product excellence: If you are interested in improving NSL's product
     excellence and developer experience, the best way is to clone this repo,
     make changes directly on the implementation in your local repo, and then
     send us pull request to integrate your changes.
 
-*   New algorithms. If you are interested in developing new algorithms for NSL,
+*   New algorithms: If you are interested in developing new algorithms for NSL,
     the best way is to study the implementations of NSL libraries, and to think
     of extensions to the existing implementation (or alternative approaches). If
     you have a proposal for a new algorithm, we recommend starting by staging
-    your project in the `research` directory and including a colab notebook to
-    showcase the new features.
-
-    If you develop new algorithms in your own repository, we are happy to
-    feature pointers to academic publications and/or repositories that use NSL,
-    on
-    [tensorflow.org/neural_structured_learning](http://www.tensorflow.org/neural_structured_learning).
+    your project in the [research](research) directory and including a colab
+    notebook to showcase the new features. If you develop new algorithms in your
+    own repository, we would be happy to feature pointers to academic
+    publications and/or repositories using NSL from this repository.
 
 Please be sure to review the [contribution guidelines](CONTRIBUTING.md).
 
@@ -128,6 +129,11 @@ please fill this
 [form](https://docs.google.com/forms/d/1AQEcPSgmwWBJj3H2haEytF4C_fr1aotWaHjCEXpPm2A);
 we would love to hear from you.
 
+## Featured Usage
+
+Please see the [usage page](usage.md) to learn more about how NSL is being
+discussed and used in the open source community.
+
 ## Release Notes
 
 Please see the [release notes](RELEASE.md) for detailed version updates.

WORKSPACE

Lines changed: 9 additions & 6 deletions
@@ -6,6 +6,14 @@ load("@bazel_tools//tools/build_defs/repo:http.bzl", "http_archive")
 load("//research/carls/third_party/farmhash:workspace.bzl", farmhash = "repo")
 farmhash()
 
+# leveldb
+load("//research/carls/third_party/leveldb:workspace.bzl", leveldb = "repo")
+leveldb()
+
+# rocksdb
+load("//research/carls/third_party/rocksdb:workspace.bzl", rocksdb = "repo")
+rocksdb()
+
 # absl
 http_archive(
     name = "com_google_absl",
@@ -56,18 +64,13 @@ http_archive(
 # Use local tf to avoid error that tensorflow objects already registered.
 # Use custom protoc to make sure all protoc are built on the version of local tf
 # curl -L "https://github.com/protocolbuffers/protobuf/releases/download/v${PROTOC_VERSION}/protoc-${PROTOC_VERSION}-linux-x86_64.zip" | sha256
-load("//research/carls:bazel/repo.bzl", "cc_tf_configure", "carls_protoc_deps")
+load("//research/carls:bazel/repo.bzl", "cc_tf_configure")
 cc_tf_configure()
 
 # Load protobuf compiler, protobuf and gRPC.
 # They MUST be in sync with TensorFlow's corresponding versions defined in
 # TENSORFLOW_DIR/tensorflow/workspace2.bzl
 
-# Protobuf compiler.
-PROTOC_VERSION = "3.9.2"
-PROTOC_SHA256 = "0d9034a3b02bd77edf5ef926fb514819a0007f84252c5e6a6391ddfc4189b904"
-carls_protoc_deps(version = PROTOC_VERSION, sha256 = PROTOC_SHA256)
-
 # Protobuffer
 http_archive(
     name = "com_google_protobuf",
g3doc/framework.md

Lines changed: 7 additions & 3 deletions
@@ -75,9 +75,9 @@ NSL brings the following advantages:
 
 ## Step-by-step Tutorials
 
-To obtain hands-on experience with Neural Structured Learning, we have three
-tutorials that cover various scenarios where structured signals may be
-explicitly given, induced or constructed:
+To obtain hands-on experience with Neural Structured Learning, we have tutorials
+that cover various scenarios where structured signals may be explicitly given,
+constructed, or induced. Here are a few:
 
 *   [Graph regularization for document classification using natural graphs](tutorials/graph_keras_mlp_cora.ipynb).
     In this tutorial, we explore the use of graph regularization to classify
@@ -91,3 +91,7 @@ explicitly given, induced or constructed:
     In this tutorial, we explore the use of adversarial learning (where
     structured signals are induced) to classify images containing numeric
     digits.
+
+More examples and tutorials can be found in the
+[examples](https://github.com/tensorflow/neural-structured-learning/tree/master/neural_structured_learning/examples)
+directory of our GitHub repository.

neural_structured_learning/BUILD

Lines changed: 1 addition & 0 deletions
@@ -31,6 +31,7 @@ py_library(
         ":version",
         "//neural_structured_learning/configs",
         "//neural_structured_learning/estimator",
+        "//neural_structured_learning/experimental",
        "//neural_structured_learning/keras",
        "//neural_structured_learning/lib",
        "//neural_structured_learning/tools",

neural_structured_learning/__init__.py

Lines changed: 1 addition & 0 deletions
@@ -2,6 +2,7 @@
 
 from neural_structured_learning import configs
 from neural_structured_learning import estimator
+from neural_structured_learning import experimental
 from neural_structured_learning import keras
 from neural_structured_learning import lib
 from neural_structured_learning import tools
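
Note: with the import added above, the experimental subpackage becomes reachable from the top-level package, which is what the new example trainers in this commit rely on. A minimal sketch of what this enables (the experimental API is, by nature, subject to change):

import neural_structured_learning as nsl
from neural_structured_learning.experimental import gnn  # module used by the new example trainers

print(nsl.experimental)  # now exposed as an attribute of the top-level package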

neural_structured_learning/examples/BUILD

Lines changed: 45 additions & 0 deletions (new file)
@@ -0,0 +1,45 @@
package(
    licenses = ["notice"],  # Apache 2.0
)

py_binary(
    name = "graph_keras_mlp_cora",
    srcs = ["graph_keras_mlp_cora.py"],
    python_version = "PY3",
    deps = [
        # package absl:app
        # package absl/flags
        # package absl/logging
        # package attr
        "//neural_structured_learning",
        # package tensorflow
    ],
)

py_binary(
    name = "graph_nets_cora_graph_regularization",
    srcs = ["graph_nets_cora_graph_regularization.py"],
    python_version = "PY3",
    deps = [
        # package absl:app
        # package absl/flags
        # package graph_nets
        "//neural_structured_learning",
        "//neural_structured_learning/experimental:gnn",
        # package tensorflow
    ],
)

py_binary(
    name = "graph_nets_cora_gcn",
    srcs = ["graph_nets_cora_gcn.py"],
    python_version = "PY3",
    deps = [
        # package absl:app
        # package absl/flags
        # package graph_nets
        "//neural_structured_learning",
        "//neural_structured_learning/experimental:gnn",
        # package tensorflow
    ],
)

neural_structured_learning/examples/README.md

Lines changed: 31 additions & 0 deletions (new file)
@@ -0,0 +1,31 @@
This directory contains examples and tutorials demonstrating how to use NSL.

## Example Trainers

The `.py` files in this directory are all example trainers runnable end-to-end
as Python programs:

*   `adv_keras_cnn_mnist.py`: Adversarial regularization on the MNIST dataset.
*   `graph_keras_mlp_cora.py`: Graph regularization on the Cora dataset.
*   `graph_nets_cora_gcn.py`: Graph Convolutional Network on the Cora dataset
    using [GraphNets](https://github.com/deepmind/graph_nets).
*   `graph_nets_cora_graph_regularization.py`: Graph regularization on the Cora
    dataset using [GraphNets](https://github.com/deepmind/graph_nets).

## Notebooks

The
[notebooks](https://github.com/tensorflow/neural-structured-learning/tree/master/neural_structured_learning/examples/notebooks)
subdirectory contains colab-based tutorials that allow you to explore NSL
interactively. This subdirectory is generally where most new tutorials are
added. Note that the tutorials shown on our
[TensorFlow website](https://www.tensorflow.org/neural_structured_learning/framework#step-by-step_tutorials)
are hosted under the
[g3doc/tutorials](https://github.com/tensorflow/neural-structured-learning/tree/master/g3doc/tutorials)
directory.

## Data Preprocessing Scripts

The
[preprocess](https://github.com/tensorflow/neural-structured-learning/tree/master/neural_structured_learning/examples/preprocess)
subdirectory contains various data preprocessing scripts.

neural_structured_learning/examples/graph_nets_cora_gcn.py

Lines changed: 57 additions & 0 deletions (new file)
@@ -0,0 +1,57 @@
# Copyright 2019 Google LLC
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# https://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
"""Example of an NSL GNN."""
from absl import app
from absl import flags
import graph_nets
import neural_structured_learning as nsl
from neural_structured_learning.experimental import gnn
import tensorflow as tf

flags.DEFINE_string(
    'train_examples_path',
    None,
    'Path to training examples.')
flags.DEFINE_string('eval_examples_path',
                    None,
                    'Path to evaluation examples.')

FLAGS = flags.FLAGS


def main(argv):
  del argv
  neighbor_config = nsl.configs.GraphNeighborConfig(max_neighbors=3)
  train_dataset = gnn.make_cora_dataset(
      FLAGS.train_examples_path, shuffle=True, neighbor_config=neighbor_config)
  eval_dataset = gnn.make_cora_dataset(FLAGS.eval_examples_path, batch_size=32)

  model = gnn.GraphConvolutionalNodeClassifier(
      seq_length=tf.data.experimental.get_structure(train_dataset)[0]
      ['words'].shape[-1],
      num_classes=7)
  model.compile(
      optimizer=tf.keras.optimizers.Adam(),
      loss=tf.keras.losses.SparseCategoricalCrossentropy(from_logits=True),
      metrics=[
          tf.keras.metrics.SparseCategoricalCrossentropy(from_logits=True),
          tf.keras.metrics.SparseCategoricalAccuracy(),
          tf.keras.metrics.SparseTopKCategoricalAccuracy(2),
      ])
  model.fit(train_dataset, epochs=30, validation_data=eval_dataset)


if __name__ == '__main__':
  graph_nets.compat.set_sonnet_version('2')
  app.run(main)
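
Note: the trainer above is driven entirely by its two flags, e.g. `python graph_nets_cora_gcn.py --train_examples_path=... --eval_examples_path=...`, pointing at Cora data such as that produced by the prep_data.sh preprocessing script also touched in this commit. As a rough sketch of what its input pipeline yields (the path below is a placeholder, and the exact feature keys depend on make_cora_dataset and the neighbor configuration):

from neural_structured_learning.experimental import gnn

# Placeholder path; use the output of examples/preprocess/cora/prep_data.sh.
dataset = gnn.make_cora_dataset('/tmp/cora/train_examples.tfr', batch_size=4)
# The trainer's seq_length lookup above implies batches of (feature dict, label).
features, labels = next(iter(dataset))
print(sorted(features.keys()))   # includes 'words', plus neighbor features when configured
print(features['words'].shape)   # (batch_size, bag-of-words length) == seq_length above
print(labels.shape)              # (batch_size,)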

neural_structured_learning/examples/graph_nets_cora_graph_regularization.py

Lines changed: 89 additions & 0 deletions (new file)
@@ -0,0 +1,89 @@
# Copyright 2019 Google LLC
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# https://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
"""Example of Graph Regularization with an NSL GNN."""
import functools

from absl import app
from absl import flags
import graph_nets
import neural_structured_learning as nsl
from neural_structured_learning.experimental import gnn
import tensorflow as tf

flags.DEFINE_string(
    'train_examples_path',
    None,
    'Path to training examples.')
flags.DEFINE_string('eval_examples_path',
                    None,
                    'Path to evaluation examples.')

FLAGS = flags.FLAGS


class NodeClassifier(tf.keras.Model):
  """Classifier model for nodes."""

  def __init__(self,
               seq_length,
               num_classes,
               hidden_units=None,
               dropout_rate=0.5,
               **kwargs):
    inputs = tf.keras.Input(shape=(seq_length,), dtype=tf.int64, name='words')
    x = tf.keras.layers.Lambda(lambda x: tf.cast(x, tf.float32))(inputs)
    for num_units in (hidden_units or [50, 50]):
      x = tf.keras.layers.Dense(num_units, activation='relu')(x)
      x = tf.keras.layers.Dropout(dropout_rate)(x)
    outputs = tf.keras.layers.Dense(num_classes)(x)
    super(NodeClassifier, self).__init__(inputs, outputs, **kwargs)


def main(argv):
  del argv
  graph_reg_config = nsl.configs.GraphRegConfig(
      neighbor_config=nsl.configs.GraphNeighborConfig(max_neighbors=3),
      multiplier=0.1,
      distance_config=nsl.configs.DistanceConfig(
          distance_type=nsl.configs.DistanceType.L2,
          reduction=tf.compat.v1.losses.Reduction.NONE,
          sum_over_axis=-1))

  train_dataset = gnn.make_cora_dataset(
      FLAGS.train_examples_path,
      shuffle=True,
      neighbor_config=graph_reg_config.neighbor_config)
  eval_dataset = gnn.make_cora_dataset(FLAGS.eval_examples_path, batch_size=32)

  model = gnn.GraphRegularizationModel(
      config=graph_reg_config,
      node_model_fn=functools.partial(
          NodeClassifier,
          seq_length=tf.data.experimental.get_structure(train_dataset)[0]
          ['words'].shape[-1],
          num_classes=7))
  model.compile(
      optimizer=tf.keras.optimizers.Adam(),
      loss=tf.keras.losses.SparseCategoricalCrossentropy(from_logits=True),
      metrics=[
          tf.keras.metrics.SparseCategoricalCrossentropy(from_logits=True),
          tf.keras.metrics.SparseCategoricalAccuracy(),
          tf.keras.metrics.SparseTopKCategoricalAccuracy(2),
      ])
  model.fit(train_dataset, epochs=30, validation_data=eval_dataset)


if __name__ == '__main__':
  graph_nets.compat.set_sonnet_version('2')
  app.run(main)
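
Note: the trainer above uses the experimental gnn.GraphRegularizationModel wrapper. For comparison, a rough sketch of the same setup written against the stable nsl.keras.GraphRegularization API, reusing NodeClassifier, graph_reg_config, train_dataset, and eval_dataset from the file above, and assuming make_cora_dataset emits the neighbor features that the wrapper expects:

import neural_structured_learning as nsl
import tensorflow as tf

base_model = NodeClassifier(
    seq_length=tf.data.experimental.get_structure(train_dataset)[0]
    ['words'].shape[-1],
    num_classes=7)
# Wrap the base model so a graph-regularization term is added to its loss.
graph_reg_model = nsl.keras.GraphRegularization(base_model, graph_reg_config)
graph_reg_model.compile(
    optimizer=tf.keras.optimizers.Adam(),
    loss=tf.keras.losses.SparseCategoricalCrossentropy(from_logits=True),
    metrics=[tf.keras.metrics.SparseCategoricalAccuracy()])
graph_reg_model.fit(train_dataset, epochs=30, validation_data=eval_dataset)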

neural_structured_learning/examples/preprocess/cora/prep_data.sh

Lines changed: 2 additions & 0 deletions
@@ -13,6 +13,8 @@
 # See the License for the specific language governing permissions and
 # limitations under the License.
 
+# Script to download and preprocess the Cora dataset for use by NSL models.
+
 # URL for downloading Cora dataset.
 URL=https://linqs-data.soe.ucsc.edu/public/lbc/cora.tgz
 