| commit (string, 40) | old_file (string, 4-237) | new_file (string, 4-237) | old_contents (string, 1-4.24k) | new_contents (string, 5-4.84k) | subject (string, 15-778) | message (string, 16-6.86k) | lang (string, 1-30) | license (13 classes) | repos (string, 5-116k) | config (string, 1-30) | content (string, 105-8.72k) |
---|---|---|---|---|---|---|---|---|---|---|---|
a7d5c9868dc9ba3f7175351269002b9c9330ae04 | README.md | README.md | HBase Indexer
=============
HBase Indexer allows you to easily and quickly index HBase rows into Solr.
Usage documentation can be found on the hbase-indexer Wiki -
http://github.com/NGDATA/hbase-indexer/wiki.
## Subprojects
### HBase SEP
A standalone library for asynchronously processing HBase mutation events
by hooking into HBase replication, see [the SEP readme](hbase-sep/README.md).
### HBase SEP & replication monitoring
A standalone utility to monitor HBase replication progress,
see [the SEP-tools readme](hbase-sep/hbase-sep-tools/README.md).
## Building
You can build the full hbase-indexer project as follows:
mvn clean install -DskipTests
**The profile 0.98, 0.96, 0.94 cannot be used at the moment since the code was modified to work with HBase 1.1.0**
The default build is linked to HBase 0.94. In order to build for HBase 0.98,
run the following command:
mvn clean install -DskipTests -Dhbase.api=0.98
The default build is linked to HBase 0.94. In order to build for HBase 1.1.0,
run the following command:
mvn clean install -DskipTests -Dhbase.api=1.1.0
| HBase Indexer
=============
HBase Indexer allows you to easily and quickly index HBase rows into Solr.
Usage documentation can be found on the hbase-indexer Wiki -
http://github.com/NGDATA/hbase-indexer/wiki.
## Subprojects
### HBase SEP
A standalone library for asynchronously processing HBase mutation events
by hooking into HBase replication, see [the SEP readme](hbase-sep/README.md).
### HBase SEP & replication monitoring
A standalone utility to monitor HBase replication progress,
see [the SEP-tools readme](hbase-sep/hbase-sep-tools/README.md).
## Building
You can build the full hbase-indexer project as follows:
mvn clean install -DskipTests
**The profile 0.98, 0.96, 0.94 cannot be used at the moment since the code was modified to work with HBase 1.1.0**
The default build is linked to HBase 0.94. In order to build for HBase 0.98,
run the following command:
mvn clean install -DskipTests -Dhbase.api=0.98
The default build is linked to HBase 0.94. In order to build for HBase 1.1.0,
run the following command:
mvn clean install -DskipTests -Dhbase.api=1.1.0
### Building Hortonworks specific
In order to build the full hbase-indexer project with an specific version of Hortonworks.
mvn clean package -Pdist -DskipTests -Dhbase.api=1.1.2 -Dhdp.version=".2.4.2.0-258"
**The profile 1.1.2 can be used with the hdp version suffix. (`-Dhdp.version`)**
| Add build an specific version of Hortonworks. | LWSHADOOP-474: Add build an specific version of Hortonworks.
| Markdown | apache-2.0 | LucidWorks/hbase-indexer,LucidWorks/hbase-indexer | markdown | ## Code Before:
HBase Indexer
=============
HBase Indexer allows you to easily and quickly index HBase rows into Solr.
Usage documentation can be found on the hbase-indexer Wiki -
http://github.com/NGDATA/hbase-indexer/wiki.
## Subprojects
### HBase SEP
A standalone library for asynchronously processing HBase mutation events
by hooking into HBase replication, see [the SEP readme](hbase-sep/README.md).
### HBase SEP & replication monitoring
A standalone utility to monitor HBase replication progress,
see [the SEP-tools readme](hbase-sep/hbase-sep-tools/README.md).
## Building
You can build the full hbase-indexer project as follows:
mvn clean install -DskipTests
**The profile 0.98, 0.96, 0.94 cannot be used at the moment since the code was modified to work with HBase 1.1.0**
The default build is linked to HBase 0.94. In order to build for HBase 0.98,
run the following command:
mvn clean install -DskipTests -Dhbase.api=0.98
The default build is linked to HBase 0.94. In order to build for HBase 1.1.0,
run the following command:
mvn clean install -DskipTests -Dhbase.api=1.1.0
## Instruction:
LWSHADOOP-474: Add build an specific version of Hortonworks.
## Code After:
HBase Indexer
=============
HBase Indexer allows you to easily and quickly index HBase rows into Solr.
Usage documentation can be found on the hbase-indexer Wiki -
http://github.com/NGDATA/hbase-indexer/wiki.
## Subprojects
### HBase SEP
A standalone library for asynchronously processing HBase mutation events
by hooking into HBase replication, see [the SEP readme](hbase-sep/README.md).
### HBase SEP & replication monitoring
A standalone utility to monitor HBase replication progress,
see [the SEP-tools readme](hbase-sep/hbase-sep-tools/README.md).
## Building
You can build the full hbase-indexer project as follows:
mvn clean install -DskipTests
**The profile 0.98, 0.96, 0.94 cannot be used at the moment since the code was modified to work with HBase 1.1.0**
The default build is linked to HBase 0.94. In order to build for HBase 0.98,
run the following command:
mvn clean install -DskipTests -Dhbase.api=0.98
The default build is linked to HBase 0.94. In order to build for HBase 1.1.0,
run the following command:
mvn clean install -DskipTests -Dhbase.api=1.1.0
### Building Hortonworks specific
In order to build the full hbase-indexer project with an specific version of Hortonworks.
mvn clean package -Pdist -DskipTests -Dhbase.api=1.1.2 -Dhdp.version=".2.4.2.0-258"
**The profile 1.1.2 can be used with the hdp version suffix. (`-Dhdp.version`)**
|
f253feac7a4c53bd17958b0c74adbec528ae2e17 | rethinkdb/setup-rethinkdb.py | rethinkdb/setup-rethinkdb.py | import rethinkdb as r
import argparse
parser = argparse.ArgumentParser(description='Set up RethinkDB locally')
args = parser.parse_args()
conn = r.connect()
r.db_create('muzhack').run(conn)
r.db('muzhack').table_create('users').run(conn)
r.db('muzhack').table_create('projects').run(conn)
r.db('muzhack').table_create('resetPasswordTokens').run(conn)
| import rethinkdb as r
import argparse
parser = argparse.ArgumentParser(description='Set up RethinkDB')
parser.add_argument('-H', '--host', default='localhost', help='Specify host')
args = parser.parse_args()
conn = r.connect(host=args.host)
r.db_create('muzhack').run(conn)
r.db('muzhack').table_create('users').run(conn)
r.db('muzhack').table_create('projects').run(conn)
r.db('muzhack').table_create('resetPasswordTokens').run(conn)
| Allow setting up rethinkdb remotely | Allow setting up rethinkdb remotely
| Python | mit | muzhack/musitechhub,muzhack/musitechhub,muzhack/musitechhub,muzhack/muzhack,muzhack/muzhack,muzhack/musitechhub,muzhack/muzhack,muzhack/muzhack | python | ## Code Before:
import rethinkdb as r
import argparse
parser = argparse.ArgumentParser(description='Set up RethinkDB locally')
args = parser.parse_args()
conn = r.connect()
r.db_create('muzhack').run(conn)
r.db('muzhack').table_create('users').run(conn)
r.db('muzhack').table_create('projects').run(conn)
r.db('muzhack').table_create('resetPasswordTokens').run(conn)
## Instruction:
Allow setting up rethinkdb remotely
## Code After:
import rethinkdb as r
import argparse
parser = argparse.ArgumentParser(description='Set up RethinkDB')
parser.add_argument('-H', '--host', default='localhost', help='Specify host')
args = parser.parse_args()
conn = r.connect(host=args.host)
r.db_create('muzhack').run(conn)
r.db('muzhack').table_create('users').run(conn)
r.db('muzhack').table_create('projects').run(conn)
r.db('muzhack').table_create('resetPasswordTokens').run(conn)
|
eaa588dfaf02a9d14f339461bb7091b3b67d36a0 | packages.d/init-auto-complete.el | packages.d/init-auto-complete.el | (add-to-list 'ac-dictionary-directories "~/.emacs.d/ac-dict")
(require 'auto-complete-config)
(ac-config-default)
(define-key ac-mode-map (kbd "M-TAB") 'auto-complete)
| (require 'auto-complete-config)
(ac-config-default)
(add-to-list 'ac-dictionary-directories "~/.emacs.d/ac-dict")
(define-key ac-mode-map (kbd "M-TAB") 'auto-complete)
| Fix order issue for AC setup | Fix order issue for AC setup
| Emacs Lisp | mit | lstoll/dotfiles,lstoll/repo,lstoll/dotfiles,lstoll/dotfiles,lstoll/repo,lstoll/repo | emacs-lisp | ## Code Before:
(add-to-list 'ac-dictionary-directories "~/.emacs.d/ac-dict")
(require 'auto-complete-config)
(ac-config-default)
(define-key ac-mode-map (kbd "M-TAB") 'auto-complete)
## Instruction:
Fix order issue for AC setup
## Code After:
(require 'auto-complete-config)
(ac-config-default)
(add-to-list 'ac-dictionary-directories "~/.emacs.d/ac-dict")
(define-key ac-mode-map (kbd "M-TAB") 'auto-complete)
|
8b6f8302fabafecc5a9d1810f164a42a0b1ebf37 | src/main/resources/static/css/main.css | src/main/resources/static/css/main.css | body {
background-color: lightblue;
font-family: sans-serif;
}
table {
border: 1px solid black;
}
th {
font-size: 150%;
text-align: left;
background-color: aquamarine;
padding: 10px;
}
td {
text-align: left;
background-color: azure;
padding: 4px;
} | body {
background-color: lightblue;
font-family: sans-serif;
}
table {
border: 1px solid black;
}
th {
font-size: 150%;
text-align: left;
background-color: aquamarine;
padding: 10px;
}
td {
text-align: left;
background-color: azure;
padding: 4px;
}
a, a:visited {
color: inherit;
text-decoration: none;
font-weight: bolder;
}
a:hover {
color: olivedrab;
} | Tweak CSS for links on form. | Tweak CSS for links on form.
| CSS | unlicense | TonyJenkins/shiny-barnacle,TonyJenkins/shiny-barnacle | css | ## Code Before:
body {
background-color: lightblue;
font-family: sans-serif;
}
table {
border: 1px solid black;
}
th {
font-size: 150%;
text-align: left;
background-color: aquamarine;
padding: 10px;
}
td {
text-align: left;
background-color: azure;
padding: 4px;
}
## Instruction:
Tweak CSS for links on form.
## Code After:
body {
background-color: lightblue;
font-family: sans-serif;
}
table {
border: 1px solid black;
}
th {
font-size: 150%;
text-align: left;
background-color: aquamarine;
padding: 10px;
}
td {
text-align: left;
background-color: azure;
padding: 4px;
}
a, a:visited {
color: inherit;
text-decoration: none;
font-weight: bolder;
}
a:hover {
color: olivedrab;
} |
a65f603a2f1169b1d147dfc9172fa8fd88dcf28f | test/integration/base-suite/ResultsFixture.js | test/integration/base-suite/ResultsFixture.js | PRESTATION_TO_TEST_POSITION = 1
PRESTATION_NAME = /Prime d’activité/
PRESTATION_DESCRIPTION = /revenus modestes/
HAS_PRESTATION_WARNING = false
HAS_PRESTATION_ENGAGER_DEMARCHES_LINK = true
| PRESTATION_TO_TEST_POSITION = 1
PRESTATION_NAME = /Prime d’activité/
PRESTATION_DESCRIPTION = /complète les revenus/
HAS_PRESTATION_WARNING = false
HAS_PRESTATION_ENGAGER_DEMARCHES_LINK = true
| Fix expected text in integration test | Fix expected text in integration test
* Rewording in f456e6416 was not replicated in tests
| JavaScript | agpl-3.0 | sgmap/mes-aides-ui,sgmap/mes-aides-ui,sgmap/mes-aides-ui,sgmap/mes-aides-ui | javascript | ## Code Before:
PRESTATION_TO_TEST_POSITION = 1
PRESTATION_NAME = /Prime d’activité/
PRESTATION_DESCRIPTION = /revenus modestes/
HAS_PRESTATION_WARNING = false
HAS_PRESTATION_ENGAGER_DEMARCHES_LINK = true
## Instruction:
Fix expected text in integration test
* Rewording in f456e6416 was not replicated in tests
## Code After:
PRESTATION_TO_TEST_POSITION = 1
PRESTATION_NAME = /Prime d’activité/
PRESTATION_DESCRIPTION = /complète les revenus/
HAS_PRESTATION_WARNING = false
HAS_PRESTATION_ENGAGER_DEMARCHES_LINK = true
|
1fa01033d36414f810af6d96f443a6d8e411280e | lib/juvia_rails/configuration.rb | lib/juvia_rails/configuration.rb | module JuviaRails
class Configuration
attr_accessor :site_key, :comment_order, :server_url
def initialize
@site_key = nil
@server_url = nil
@comment_order = 'latest-first'
end
end
end | module JuviaRails
class Configuration
attr_accessor :site_key, :comment_order, :server_url, :include_css
def initialize
@site_key = nil
@server_url = nil
@comment_order = 'latest-first'
@include_css = nil
end
end
end | Add include_css option to config | Add include_css option to config
| Ruby | mit | theodi/juvia_rails | ruby | ## Code Before:
module JuviaRails
class Configuration
attr_accessor :site_key, :comment_order, :server_url
def initialize
@site_key = nil
@server_url = nil
@comment_order = 'latest-first'
end
end
end
## Instruction:
Add include_css option to config
## Code After:
module JuviaRails
class Configuration
attr_accessor :site_key, :comment_order, :server_url, :include_css
def initialize
@site_key = nil
@server_url = nil
@comment_order = 'latest-first'
@include_css = nil
end
end
end |
d95273bd72f849e0aab2341e04f77f30920e57a8 | app/views/pages/integrate.en.html.erb | app/views/pages/integrate.en.html.erb | <% @title = t(:integrate_the_form_in_your_website) %>
<h2><%= link_to t(:integrate_the_form_in_your_website), integrate_path %></h2>
| <% @title = t(:integrate_the_form_in_your_website) %>
<h2><%= link_to t(:integrate_the_form_in_your_website), integrate_path %></h2>
<p>Embed this form into your website</p>
<p>Your website can actively participate: be a part of the net of blogs and
websites that ask for sending emails to stop the massacre that is being
held in the Sahara.</p>
<p>If you want to embed this form into your website, just create a new
entry in your blog (Wordpress, Drupal, SPIP or any other) explaining the
campaign. Then copy the following
<a href="/resources/htmlformes.txt">HTML code</a> in this new entry (take into
account that you might need to enable "full html" code). It is important
to ask for politeness and clarity in their messages (they are sent from
the user's own email address) in order to increase their effectiveness.
Please keep your blog entry visible during the lenght of the campaign.</p>
| Add Spanish translation by iokese | Add Spanish translation by iokese
Signed-off-by: Quique <da035149b353ae0c01e142faa3dd77c14a65464f@gmail.com>
| HTML+ERB | agpl-3.0 | apardo/sahara,apardo/sahara | html+erb | ## Code Before:
<% @title = t(:integrate_the_form_in_your_website) %>
<h2><%= link_to t(:integrate_the_form_in_your_website), integrate_path %></h2>
## Instruction:
Add Spanish translation by iokese
Signed-off-by: Quique <da035149b353ae0c01e142faa3dd77c14a65464f@gmail.com>
## Code After:
<% @title = t(:integrate_the_form_in_your_website) %>
<h2><%= link_to t(:integrate_the_form_in_your_website), integrate_path %></h2>
<p>Embed this form into your website</p>
<p>Your website can actively participate: be a part of the net of blogs and
websites that ask for sending emails to stop the massacre that is being
held in the Sahara.</p>
<p>If you want to embed this form into your website, just create a new
entry in your blog (Wordpress, Drupal, SPIP or any other) explaining the
campaign. Then copy the following
<a href="/resources/htmlformes.txt">HTML code</a> in this new entry (take into
account that you might need to enable "full html" code). It is important
to ask for politeness and clarity in their messages (they are sent from
the user's own email address) in order to increase their effectiveness.
Please keep your blog entry visible during the lenght of the campaign.</p>
|
36558e28a1bef96c4c7550e90343931f31f628b2 | _temp/hud/src/index.html | _temp/hud/src/index.html | <!doctype html>
<html lang="en">
<head>
<meta charset="UTF-8">
<title>Glimpse</title>
</head>
<body>
<script src="./main.js" id="__glimpse_hud" data-request-id="1234"></script>
</body>
</html> | <!doctype html>
<html lang="en">
<head>
<meta charset="UTF-8">
<title>Glimpse</title>
</head>
<body>
<script src="./main.js"
id="__glimpse_hud"
data-request-id="1234"
data-client-template="/glimpse/client/index.html?hash=8a6b2a0a{&requestId,follow,metadataUri}"
data-context-template="/glimpse/context/?contextId={contextId}{&types}"
data-metadata-template="/glimpse/metadata/?hash=8a6b2a0a"></script>
</body>
</html> | Update script tag for hud to have latest template attributes to match server | Update script tag for hud to have latest template attributes to match server
| HTML | unknown | avanderhoorn/Glimpse.Client.Prototype,Glimpse/Glimpse.Client.Prototype,Glimpse/Glimpse.Client.Prototype,Glimpse/Glimpse.Client.Prototype,avanderhoorn/Glimpse.Client.Prototype | html | ## Code Before:
<!doctype html>
<html lang="en">
<head>
<meta charset="UTF-8">
<title>Glimpse</title>
</head>
<body>
<script src="./main.js" id="__glimpse_hud" data-request-id="1234"></script>
</body>
</html>
## Instruction:
Update script tag for hud to have latest template attributes to match server
## Code After:
<!doctype html>
<html lang="en">
<head>
<meta charset="UTF-8">
<title>Glimpse</title>
</head>
<body>
<script src="./main.js"
id="__glimpse_hud"
data-request-id="1234"
data-client-template="/glimpse/client/index.html?hash=8a6b2a0a{&requestId,follow,metadataUri}"
data-context-template="/glimpse/context/?contextId={contextId}{&types}"
data-metadata-template="/glimpse/metadata/?hash=8a6b2a0a"></script>
</body>
</html> |
2b87bbc4fe04e70a5a75702cafe986f1493daac7 | app/synchronizers/solidus_mailchimp_sync/product_synchronizer.rb | app/synchronizers/solidus_mailchimp_sync/product_synchronizer.rb | module SolidusMailchimpSync
class ProductSynchronizer < BaseSynchronizer
self.serializer_class_name = "::SolidusMailchimpSync::ProductSerializer"
self.synced_attributes = %w{name description slug available_on}
# Since Mailchimp API 3.0 doesn't let us update products, important to wait
# until product is really ready to sync it the first time.
class_attribute :only_auto_sync_if
self.only_auto_sync_if = lambda { |p| p.available? }
def can_sync?
only_auto_sync_if.call(model) && super
end
def sync
# We go ahead and try to create it. If it already existed, mailchimp
# doesn't let us do an update, but we can update all variants.
post
rescue SolidusMailchimpSync::Error => e
if e.status == 400 && e.detail =~ /already exists/
sync_all_variants
else
raise e
end
end
def path
"/products/#{CGI.escape mailchimp_id}"
end
def create_path
"/products"
end
def mailchimp_id
self.class.product_id(model)
end
def self.product_id(product)
serializer_class_name.constantize.new(product).as_json[:id]
end
def sync_all_variants
model.variants_including_master.collect do |variant|
VariantSynchronizer.new(variant).sync
end
end
end
end
| module SolidusMailchimpSync
class ProductSynchronizer < BaseSynchronizer
self.serializer_class_name = "::SolidusMailchimpSync::ProductSerializer"
self.synced_attributes = %w{name description slug available_on}
# Since Mailchimp API 3.0 doesn't let us update products, important to wait
# until product is really ready to sync it the first time.
class_attribute :only_auto_sync_if
self.only_auto_sync_if = lambda { |p| p.available? }
def can_sync?
only_auto_sync_if.call(model) && super
end
def sync
post
rescue SolidusMailchimpSync::Error => e
if e.status == 400 && e.detail =~ /already exists/
patch
else
raise e
end
end
def path
"/products/#{CGI.escape mailchimp_id}"
end
def create_path
"/products"
end
def mailchimp_id
self.class.product_id(model)
end
def self.product_id(product)
serializer_class_name.constantize.new(product).as_json[:id]
end
def sync_all_variants
model.variants_including_master.collect do |variant|
VariantSynchronizer.new(variant).sync
end
end
end
end
| Allow patch for updating products | Allow patch for updating products
| Ruby | bsd-3-clause | LaLigne/solidus_mailchimp_sync,LaLigne/solidus_mailchimp_sync | ruby | ## Code Before:
module SolidusMailchimpSync
class ProductSynchronizer < BaseSynchronizer
self.serializer_class_name = "::SolidusMailchimpSync::ProductSerializer"
self.synced_attributes = %w{name description slug available_on}
# Since Mailchimp API 3.0 doesn't let us update products, important to wait
# until product is really ready to sync it the first time.
class_attribute :only_auto_sync_if
self.only_auto_sync_if = lambda { |p| p.available? }
def can_sync?
only_auto_sync_if.call(model) && super
end
def sync
# We go ahead and try to create it. If it already existed, mailchimp
# doesn't let us do an update, but we can update all variants.
post
rescue SolidusMailchimpSync::Error => e
if e.status == 400 && e.detail =~ /already exists/
sync_all_variants
else
raise e
end
end
def path
"/products/#{CGI.escape mailchimp_id}"
end
def create_path
"/products"
end
def mailchimp_id
self.class.product_id(model)
end
def self.product_id(product)
serializer_class_name.constantize.new(product).as_json[:id]
end
def sync_all_variants
model.variants_including_master.collect do |variant|
VariantSynchronizer.new(variant).sync
end
end
end
end
## Instruction:
Allow patch for updating products
## Code After:
module SolidusMailchimpSync
class ProductSynchronizer < BaseSynchronizer
self.serializer_class_name = "::SolidusMailchimpSync::ProductSerializer"
self.synced_attributes = %w{name description slug available_on}
# Since Mailchimp API 3.0 doesn't let us update products, important to wait
# until product is really ready to sync it the first time.
class_attribute :only_auto_sync_if
self.only_auto_sync_if = lambda { |p| p.available? }
def can_sync?
only_auto_sync_if.call(model) && super
end
def sync
post
rescue SolidusMailchimpSync::Error => e
if e.status == 400 && e.detail =~ /already exists/
patch
else
raise e
end
end
def path
"/products/#{CGI.escape mailchimp_id}"
end
def create_path
"/products"
end
def mailchimp_id
self.class.product_id(model)
end
def self.product_id(product)
serializer_class_name.constantize.new(product).as_json[:id]
end
def sync_all_variants
model.variants_including_master.collect do |variant|
VariantSynchronizer.new(variant).sync
end
end
end
end
|
f3bdab63a999c2102b16bc7c9c536b4a8e212b53 | .gitlab-ci.yml | .gitlab-ci.yml | variables:
GIT_STRATEGY: fetch
GIT_SUBMODULE_STRATEGY: normal
NODE_OPTIONS: --no-warnings
stages:
- test
tests:
stage: test
tags:
- windows
before_script:
- Import-Module "$env:ChocolateyInstall\helpers\chocolateyProfile.psm1"
- choco install -y yarn
- refreshenv
script:
- yarn
- yarn jest --testTimeout 30000
| variables:
GIT_STRATEGY: fetch
GIT_CHECKOUT: "false"
GIT_SUBMODULE_STRATEGY: normal
NODE_OPTIONS: --no-warnings
stages:
- test
tests:
stage: test
tags:
- windows
before_script:
- Import-Module "$env:ChocolateyInstall\helpers\chocolateyProfile.psm1"
- choco install -y yarn
- refreshenv
script:
- git config core.symlinks true
- git checkout $CI_COMMIT_REF_NAME
- git submodule update --init --depth 1
- yarn
- yarn jest --testTimeout 30000
| Test a fix for Windows CI issues with symlinks | Test a fix for Windows CI issues with symlinks
| YAML | mit | Squishymedia/BIDS-Validator,nellh/bids-validator,nellh/bids-validator,nellh/bids-validator,Squishymedia/bids-validator | yaml | ## Code Before:
variables:
GIT_STRATEGY: fetch
GIT_SUBMODULE_STRATEGY: normal
NODE_OPTIONS: --no-warnings
stages:
- test
tests:
stage: test
tags:
- windows
before_script:
- Import-Module "$env:ChocolateyInstall\helpers\chocolateyProfile.psm1"
- choco install -y yarn
- refreshenv
script:
- yarn
- yarn jest --testTimeout 30000
## Instruction:
Test a fix for Windows CI issues with symlinks
## Code After:
variables:
GIT_STRATEGY: fetch
GIT_CHECKOUT: "false"
GIT_SUBMODULE_STRATEGY: normal
NODE_OPTIONS: --no-warnings
stages:
- test
tests:
stage: test
tags:
- windows
before_script:
- Import-Module "$env:ChocolateyInstall\helpers\chocolateyProfile.psm1"
- choco install -y yarn
- refreshenv
script:
- git config core.symlinks true
- git checkout $CI_COMMIT_REF_NAME
- git submodule update --init --depth 1
- yarn
- yarn jest --testTimeout 30000
|
90fa23d1d1b2497d65507b7930323b118f512a25 | disco_aws_automation/disco_acm.py | disco_aws_automation/disco_acm.py | import logging
import boto3
import botocore
class DiscoACM(object):
"""
A class to manage the Amazon Certificate Service
"""
def __init__(self, connection=None):
self._acm = connection
@property
def acm(self):
"""
Lazily creates ACM connection
NOTE!!! As of 2016-02-11 ACM is not supported outside the us-east-1 region.
Return None if service does not exist in current region
"""
if not self._acm:
try:
self._acm = boto3.client('acm', region_name='us-east-1')
except Exception:
logging.warning("ACM service does not exist in current region")
return None
return self._acm
def get_certificate_arn(self, dns_name):
"""Returns a Certificate ARN from the Amazon Certificate Service given the DNS name"""
if not self.acm:
return None
try:
certs = self.acm.list_certificates()["CertificateSummaryList"]
cert = [cert['CertificateArn'] for cert in certs if cert['DomainName'] == dns_name]
return cert[0] if cert else None
except (botocore.exceptions.EndpointConnectionError, botocore.vendored.requests.exceptions.ConnectionError):
# some versions of botocore(1.3.26) will try to connect to acm even if outside us-east-1
return None
| import logging
import boto3
import botocore
class DiscoACM(object):
"""
A class to manage the Amazon Certificate Service
"""
def __init__(self, connection=None):
self._acm = connection
@property
def acm(self):
"""
Lazily creates ACM connection
NOTE!!! As of 2016-02-11 ACM is not supported outside the us-east-1 region.
Return None if service does not exist in current region
"""
if not self._acm:
try:
self._acm = boto3.client('acm', region_name='us-east-1')
except Exception:
logging.warning("ACM service does not exist in current region")
return None
return self._acm
def get_certificate_arn(self, dns_name):
"""Returns a Certificate ARN from the Amazon Certificate Service given the DNS name"""
if not self.acm:
return None
try:
certs = self.acm.list_certificates()["CertificateSummaryList"]
cert = [cert['CertificateArn'] for cert in certs if cert['DomainName'] == dns_name]
return cert[0] if cert else None
except botocore.exceptions.EndpointConnectionError:
# some versions of botocore(1.3.26) will try to connect to acm even if outside us-east-1
return None
| Revert "Swallow proxy exception from requests" | Revert "Swallow proxy exception from requests"
This reverts commit 8d9ccbb2bbde7c2f8dbe60b90f730d87b924d86e.
| Python | bsd-2-clause | amplifylitco/asiaq,amplifylitco/asiaq,amplifylitco/asiaq | python | ## Code Before:
import logging
import boto3
import botocore
class DiscoACM(object):
"""
A class to manage the Amazon Certificate Service
"""
def __init__(self, connection=None):
self._acm = connection
@property
def acm(self):
"""
Lazily creates ACM connection
NOTE!!! As of 2016-02-11 ACM is not supported outside the us-east-1 region.
Return None if service does not exist in current region
"""
if not self._acm:
try:
self._acm = boto3.client('acm', region_name='us-east-1')
except Exception:
logging.warning("ACM service does not exist in current region")
return None
return self._acm
def get_certificate_arn(self, dns_name):
"""Returns a Certificate ARN from the Amazon Certificate Service given the DNS name"""
if not self.acm:
return None
try:
certs = self.acm.list_certificates()["CertificateSummaryList"]
cert = [cert['CertificateArn'] for cert in certs if cert['DomainName'] == dns_name]
return cert[0] if cert else None
except (botocore.exceptions.EndpointConnectionError, botocore.vendored.requests.exceptions.ConnectionError):
# some versions of botocore(1.3.26) will try to connect to acm even if outside us-east-1
return None
## Instruction:
Revert "Swallow proxy exception from requests"
This reverts commit 8d9ccbb2bbde7c2f8dbe60b90f730d87b924d86e.
## Code After:
import logging
import boto3
import botocore
class DiscoACM(object):
"""
A class to manage the Amazon Certificate Service
"""
def __init__(self, connection=None):
self._acm = connection
@property
def acm(self):
"""
Lazily creates ACM connection
NOTE!!! As of 2016-02-11 ACM is not supported outside the us-east-1 region.
Return None if service does not exist in current region
"""
if not self._acm:
try:
self._acm = boto3.client('acm', region_name='us-east-1')
except Exception:
logging.warning("ACM service does not exist in current region")
return None
return self._acm
def get_certificate_arn(self, dns_name):
"""Returns a Certificate ARN from the Amazon Certificate Service given the DNS name"""
if not self.acm:
return None
try:
certs = self.acm.list_certificates()["CertificateSummaryList"]
cert = [cert['CertificateArn'] for cert in certs if cert['DomainName'] == dns_name]
return cert[0] if cert else None
except botocore.exceptions.EndpointConnectionError:
# some versions of botocore(1.3.26) will try to connect to acm even if outside us-east-1
return None
|
5bd5f99315a1c6f705034f8a0bf2d06b4060d167 | lib/ffi/io.rb | lib/ffi/io.rb | module FFI
module IO
def self.for_fd(fd, mode = "r")
::IO.for_fd(fd, mode)
end
end
end | module FFI
module IO
def self.for_fd(fd, mode = "r")
::IO.for_fd(fd, mode)
end
#
# A version of IO#read that reads into a native buffer
#
# This will be optimized at some future time to eliminate the double copy
#
def self.read(io, buf, len)
tmp = io.read(len)
return -1 unless tmp
buf.put_bytes(0, tmp)
tmp.length
end
end
end
| Add FFI::IO.read as described in JRUBY-3636 | Add FFI::IO.read as described in JRUBY-3636
| Ruby | bsd-3-clause | yghannam/ffi,MikaelSmith/ffi,mvz/ffi,tduehr/ffi,sparkchaser/ffi,yghannam/ffi,majioa/ffi,ferventcoder/ffi,yghannam/ffi,mvz/ffi,majioa/ffi,mvz/ffi,MikaelSmith/ffi,majioa/ffi,tduehr/ffi,MikaelSmith/ffi,mvz/ffi,yghannam/ffi,ferventcoder/ffi,ferventcoder/ffi,yghannam/ffi,ffi/ffi,tduehr/ffi,ferventcoder/ffi,sparkchaser/ffi,ffi/ffi,sparkchaser/ffi,MikaelSmith/ffi,tduehr/ffi,ffi/ffi,majioa/ffi,sparkchaser/ffi | ruby | ## Code Before:
module FFI
module IO
def self.for_fd(fd, mode = "r")
::IO.for_fd(fd, mode)
end
end
end
## Instruction:
Add FFI::IO.read as described in JRUBY-3636
## Code After:
module FFI
module IO
def self.for_fd(fd, mode = "r")
::IO.for_fd(fd, mode)
end
#
# A version of IO#read that reads into a native buffer
#
# This will be optimized at some future time to eliminate the double copy
#
def self.read(io, buf, len)
tmp = io.read(len)
return -1 unless tmp
buf.put_bytes(0, tmp)
tmp.length
end
end
end
|
4b4156b66f5a41ef7451cf1c39fa7c948e363406 | lib/Settings/readme.md | lib/Settings/readme.md |
Displays the settings page for a Stripes module, given a list of the sub-pages to link and route to.
## Usage
```
import React from 'react';
import Settings from './Settings';
import PermissionSets from './PermissionSets';
import PatronGroupsSettings from './PatronGroupsSettings';
const pages = [
{ route: 'perms', label: 'Permission sets', component: PermissionSets, perm: 'perms.permissions.get' },
{ route: 'groups', label: 'Patron groups', component: PatronGroupsSettings },
];
export default props => <Settings {...props} pages={pages} />;
```
## Properties
The following properties are supported:
* `stripes`: the Stripes object _must_ be passed through from the caller.
* `pages`: the list of sub-pages to be linked from the settings page. Each member of the list is an object with the following members:
* `route`: the route, relative to that of the settings page, on which the sub-page should be found.
* `label`: the human-readable label that, when clicked on, links to the specified route.
* `component`: the component that is rendered at the specified route.
* `perm`: if specified, the name of a permission which the current user must have in order to access the page; if the user lacks the permission, then the link is not provided. (If omitted, then no permission check is performed for the sub-page.)
|
Displays the settings page for a Stripes module, given a list of the sub-pages to link and route to.
## Usage
```
import React from 'react';
import Settings from './Settings';
import PermissionSets from './PermissionSets';
import PatronGroupsSettings from './PatronGroupsSettings';
const pages = [
{ route: 'perms', label: 'Permission sets', component: PermissionSets, perm: 'perms.permissions.get' },
{ route: 'groups', label: 'Patron groups', component: PatronGroupsSettings },
];
export default props => <Settings {...props} pages={pages} />;
```
## Properties
The following properties are supported:
* `pages`: the list of sub-pages to be linked from the settings page. Each member of the list is an object with the following members:
* `route`: the route, relative to that of the settings page, on which the sub-page should be found.
* `label`: the human-readable label that, when clicked on, links to the specified route.
* `component`: the component that is rendered at the specified route.
* `perm`: if specified, the name of a permission which the current user must have in order to access the page; if the user lacks the permission, then the link is not provided. (If omitted, then no permission check is performed for the sub-page.)
| Remove claim in docs that <Settings> requires the "stripes" prop | Remove claim in docs that <Settings> requires the "stripes" prop
| Markdown | apache-2.0 | folio-org/stripes-components,folio-org/stripes-components | markdown | ## Code Before:
Displays the settings page for a Stripes module, given a list of the sub-pages to link and route to.
## Usage
```
import React from 'react';
import Settings from './Settings';
import PermissionSets from './PermissionSets';
import PatronGroupsSettings from './PatronGroupsSettings';
const pages = [
{ route: 'perms', label: 'Permission sets', component: PermissionSets, perm: 'perms.permissions.get' },
{ route: 'groups', label: 'Patron groups', component: PatronGroupsSettings },
];
export default props => <Settings {...props} pages={pages} />;
```
## Properties
The following properties are supported:
* `stripes`: the Stripes object _must_ be passed through from the caller.
* `pages`: the list of sub-pages to be linked from the settings page. Each member of the list is an object with the following members:
* `route`: the route, relative to that of the settings page, on which the sub-page should be found.
* `label`: the human-readable label that, when clicked on, links to the specified route.
* `component`: the component that is rendered at the specified route.
* `perm`: if specified, the name of a permission which the current user must have in order to access the page; if the user lacks the permission, then the link is not provided. (If omitted, then no permission check is performed for the sub-page.)
## Instruction:
Remove claim in docs that <Settings> requires the "stripes" prop
## Code After:
Displays the settings page for a Stripes module, given a list of the sub-pages to link and route to.
## Usage
```
import React from 'react';
import Settings from './Settings';
import PermissionSets from './PermissionSets';
import PatronGroupsSettings from './PatronGroupsSettings';
const pages = [
{ route: 'perms', label: 'Permission sets', component: PermissionSets, perm: 'perms.permissions.get' },
{ route: 'groups', label: 'Patron groups', component: PatronGroupsSettings },
];
export default props => <Settings {...props} pages={pages} />;
```
## Properties
The following properties are supported:
* `pages`: the list of sub-pages to be linked from the settings page. Each member of the list is an object with the following members:
* `route`: the route, relative to that of the settings page, on which the sub-page should be found.
* `label`: the human-readable label that, when clicked on, links to the specified route.
* `component`: the component that is rendered at the specified route.
* `perm`: if specified, the name of a permission which the current user must have in order to access the page; if the user lacks the permission, then the link is not provided. (If omitted, then no permission check is performed for the sub-page.)
|
a162a996d9eafa168f0bf321df9bc2b6996fb9d3 | src/test/java/com/gmail/cs475x/anybar4j/AnyBar4jTest.java | src/test/java/com/gmail/cs475x/anybar4j/AnyBar4jTest.java | package com.gmail.cs475x.anybar4j;
import static org.junit.Assert.assertEquals;
import org.junit.Test;
import com.gmail.cs475x.anybar4j.AnyBar4j.AnyBarImage;
public class AnyBar4jTest {
@Test
public void shouldWorkWithDefaultHostAndPort() {
Exception exception = null;
try {
AnyBar4j anybar = new AnyBar4j(AnyBar4j.DEFAULT_HOST, AnyBar4j.DEFAULT_PORT);
anybar.setImage(AnyBarImage.GREEN);
anybar.close();
} catch (Exception e) {
exception = e;
}
assertEquals(null, exception);
}
@Test
public void shouldUseDefaultPortIfSuppliedPortIsLessThanOrEqualToZero() {
AnyBar4j anybar = null;
Exception exception = null;
try {
anybar = new AnyBar4j(AnyBar4j.DEFAULT_HOST, -1);
anybar.close();
} catch (Exception e) {
exception = e;
}
assertEquals(null, exception);
assertEquals(AnyBar4j.DEFAULT_PORT, anybar.port);
}
}
| package com.gmail.cs475x.anybar4j;
import static org.junit.Assert.assertEquals;
import org.junit.Test;
import com.gmail.cs475x.anybar4j.AnyBar4j.AnyBarImage;
public class AnyBar4jTest {
@Test
public void shouldWorkWithDefaultHostAndPort() {
AnyBar4j anybar = null;
Exception exception = null;
try {
anybar = new AnyBar4j(AnyBar4j.DEFAULT_HOST, AnyBar4j.DEFAULT_PORT);
anybar.setImage(AnyBarImage.GREEN);
} catch (Exception e) {
exception = e;
} finally {
if (anybar != null) {
anybar.close();
}
}
assertEquals(null, exception);
}
@Test
public void shouldUseDefaultPortIfSuppliedPortIsLessThanOrEqualToZero() {
AnyBar4j anybar = null;
Exception exception = null;
try {
anybar = new AnyBar4j(AnyBar4j.DEFAULT_HOST, -1);
} catch (Exception e) {
exception = e;
} finally {
if (anybar != null) {
anybar.close();
}
}
assertEquals(null, exception);
assertEquals(AnyBar4j.DEFAULT_PORT, anybar.port);
}
}
| Move `close()` calls to finally block | Move `close()` calls to finally block
| Java | mit | cs475x/AnyBar4j | java | ## Code Before:
package com.gmail.cs475x.anybar4j;
import static org.junit.Assert.assertEquals;
import org.junit.Test;
import com.gmail.cs475x.anybar4j.AnyBar4j.AnyBarImage;
public class AnyBar4jTest {
@Test
public void shouldWorkWithDefaultHostAndPort() {
Exception exception = null;
try {
AnyBar4j anybar = new AnyBar4j(AnyBar4j.DEFAULT_HOST, AnyBar4j.DEFAULT_PORT);
anybar.setImage(AnyBarImage.GREEN);
anybar.close();
} catch (Exception e) {
exception = e;
}
assertEquals(null, exception);
}
@Test
public void shouldUseDefaultPortIfSuppliedPortIsLessThanOrEqualToZero() {
AnyBar4j anybar = null;
Exception exception = null;
try {
anybar = new AnyBar4j(AnyBar4j.DEFAULT_HOST, -1);
anybar.close();
} catch (Exception e) {
exception = e;
}
assertEquals(null, exception);
assertEquals(AnyBar4j.DEFAULT_PORT, anybar.port);
}
}
## Instruction:
Move `close()` calls to finally block
## Code After:
package com.gmail.cs475x.anybar4j;
import static org.junit.Assert.assertEquals;
import org.junit.Test;
import com.gmail.cs475x.anybar4j.AnyBar4j.AnyBarImage;
public class AnyBar4jTest {
@Test
public void shouldWorkWithDefaultHostAndPort() {
AnyBar4j anybar = null;
Exception exception = null;
try {
anybar = new AnyBar4j(AnyBar4j.DEFAULT_HOST, AnyBar4j.DEFAULT_PORT);
anybar.setImage(AnyBarImage.GREEN);
} catch (Exception e) {
exception = e;
} finally {
if (anybar != null) {
anybar.close();
}
}
assertEquals(null, exception);
}
@Test
public void shouldUseDefaultPortIfSuppliedPortIsLessThanOrEqualToZero() {
AnyBar4j anybar = null;
Exception exception = null;
try {
anybar = new AnyBar4j(AnyBar4j.DEFAULT_HOST, -1);
} catch (Exception e) {
exception = e;
} finally {
if (anybar != null) {
anybar.close();
}
}
assertEquals(null, exception);
assertEquals(AnyBar4j.DEFAULT_PORT, anybar.port);
}
}
|
747c7bc260aab88d8520e410a8554bde63f25a24 | app/views/noisy_workflow/request_fact_check.text.erb | app/views/noisy_workflow/request_fact_check.text.erb | Hello,
The following piece of mainstream GOV.UK content is ready to be checked for factual accuracy:
Title: <%= @edition.title %>
<%= "#{Plek.current.find("private-frontend")}/#{@edition.slug}?edition=#{@edition.version_number}" %>
You will need your GOV.UK user name and password to preview the content.
If you or your colleagues spot something that's factually incorrect please reply to this email with comments (and where there are no inaccuracies, please reply indicating so):
- directing us to any specific instances, e.g. part 2, subheading 3
- explaining what the problem is, e.g. the text implies a legal obligation, when there isn't one
Please don't rewrite the text - only point out any factual errors. We will rewrite accordingly.
Please also ensure that your comments are in plain text and sent in the body of the reply email.
Attachments are not supported and will not be logged in our system - therefore no attachments please.
Please respond within five working days. If you may not be able to meet this deadline let us know as soon as possible.
PLEASE SEND COMMENTS TO THIS EMAIL ADDRESS:
<%= @edition.fact_check_email_address %>
This will ensure they are logged in the right place in our system.
If you've forgotten your username and password, please email govuk-feedback@digital.cabinet-office.gov.uk.
Many thanks,
GOV.UK editorial team
| Hello.
The following piece of mainstream GOV.UK content needs to be fact checked because
Title: <%= @edition.title %>
<%= "#{Plek.current.find("private-frontend")}/#{@edition.slug}?edition=#{@edition.version_number}" %>
You'll need your GOV.UK user name and password to view the content.
Please reply to this email and either:
- confirm that the content is correct
- provide clear specific details of what's incorrect, why and what changes should be made
Only include factual errors - we'll rewrite any content as needed.
You must make sure:
- you only send 1 reply
- your reply is in plain text (eg no colours, italics or highlighted text)
- you don't send any attachments
Please reply within 5 working days. Let us know as soon as possible if you're not able to meet this deadline.
Email govuk-feedback@digital.cabinet-office.gov.uk if you've forgotten your username and password.
Many thanks,
GOV.UK mainstream content team.
| Clean up email to match GOV.UK style and clarity | Clean up email to match GOV.UK style and clarity
Number of amends to make the email cleaner and simpler, hopefully providing clearer instructions to fact checkers. | HTML+ERB | mit | leftees/publisher,leftees/publisher,alphagov/publisher,telekomatrix/publisher,leftees/publisher,telekomatrix/publisher,telekomatrix/publisher,leftees/publisher,alphagov/publisher,telekomatrix/publisher,alphagov/publisher | html+erb | ## Code Before:
Hello,
The following piece of mainstream GOV.UK content is ready to be checked for factual accuracy:
Title: <%= @edition.title %>
<%= "#{Plek.current.find("private-frontend")}/#{@edition.slug}?edition=#{@edition.version_number}" %>
You will need your GOV.UK user name and password to preview the content.
If you or your colleagues spot something that's factually incorrect please reply to this email with comments (and where there are no inaccuracies, please reply indicating so):
- directing us to any specific instances, e.g. part 2, subheading 3
- explaining what the problem is, e.g. the text implies a legal obligation, when there isn't one
Please don't rewrite the text - only point out any factual errors. We will rewrite accordingly.
Please also ensure that your comments are in plain text and sent in the body of the reply email.
Attachments are not supported and will not be logged in our system - therefore no attachments please.
Please respond within five working days. If you may not be able to meet this deadline let us know as soon as possible.
PLEASE SEND COMMENTS TO THIS EMAIL ADDRESS:
<%= @edition.fact_check_email_address %>
This will ensure they are logged in the right place in our system.
If you've forgotten your username and password, please email govuk-feedback@digital.cabinet-office.gov.uk.
Many thanks,
GOV.UK editorial team
## Instruction:
Clean up email to match GOV.UK style and clarity
Number of amends to make the email cleaner and simpler, hopefully providing clearer instructions to fact checkers.
## Code After:
Hello.
The following piece of mainstream GOV.UK content needs to be fact checked because
Title: <%= @edition.title %>
<%= "#{Plek.current.find("private-frontend")}/#{@edition.slug}?edition=#{@edition.version_number}" %>
You'll need your GOV.UK user name and password to view the content.
Please reply to this email and either:
- confirm that the content is correct
- provide clear specific details of what's incorrect, why and what changes should be made
Only include factual errors - we'll rewrite any content as needed.
You must make sure:
- you only send 1 reply
- your reply is in plain text (eg no colours, italics or highlighted text)
- you don't send any attachments
Please reply within 5 working days. Let us know as soon as possible if you're not able to meet this deadline.
Email govuk-feedback@digital.cabinet-office.gov.uk if you've forgotten your username and password.
Many thanks,
GOV.UK mainstream content team.
|
c6855fa87770e65fb0a6986386ce97c894a554c4 | tests/foundation/class-foundation-test.php | tests/foundation/class-foundation-test.php | <?php
namespace Xu\Tests\Foundation;
use Xu\Foundation\Foundation;
class Foundation_Test extends \WP_UnitTestCase {
public function test_component() {
$this->assertTrue( xu( '' ) instanceof Foundation );
$this->assertEquals( Foundation::VERSION, xu( 'xu' )->version() );
}
public function test_fn_method() {
$this->assertEquals( 'xu-dashify', \xu()->fn( 'xu_dashify', ['xu_dashify'] ) );
$this->assertEquals( 'xu-dashify', \xu()->fn( 'dashify', 'xu_dashify' ) );
}
public function test_component_method() {
try {
\xu()->component( null );
} catch ( \Exception $e ) {
$this->assertEquals( 'Invalid argument. `$component` must be string.', $e->getMessage() );
}
try {
\xu()->component( 'frozzare.tank.container' );
} catch ( \Exception $e ) {
$this->assertEquals( '`Xu\\Components\\Frozzare\\Tank\\Container` class does not exists.', $e->getMessage() );
}
try {
\xu()->component( 'test' );
} catch ( \Exception $e ) {
$this->assertEquals( '`Xu\\Components\\Test\\Test` class is not a instance of Xu\\Components\\Component.', $e->getMessage() );
}
}
}
| <?php
namespace Xu\Tests\Foundation;
use Xu\Foundation\Foundation;
class Foundation_Test extends \WP_UnitTestCase {
public function test_component() {
$this->assertTrue( xu( '' ) instanceof Foundation );
$this->assertEquals( Foundation::VERSION, xu( 'xu' )->version() );
}
public function test_fn_method() {
$this->assertEquals( 'xu-dashify', \xu()->fn( 'xu_dashify', ['xu_dashify'] ) );
$this->assertEquals( 'xu-dashify', \xu()->fn( 'dashify', 'xu_dashify' ) );
}
public function test_component_method() {
try {
\xu()->component( null );
} catch ( \Exception $e ) {
$this->assertEquals( 'Invalid argument. `$component` must be string.', $e->getMessage() );
}
try {
\xu()->component( 'frozzare.tank.container' );
} catch ( \Exception $e ) {
$this->assertEquals( '`Xu\\Components\\Frozzare\\Tank\\Container` class does not exists.', $e->getMessage() );
}
try {
\xu()->component( 'test' );
} catch ( \Exception $e ) {
$this->assertEquals( '`Xu\\Components\\Test\\Test` class is not a instance of Xu\\Components\\Component.', $e->getMessage() );
}
try {
\xu()->component( 'Test\\Test' );
} catch ( \Exception $e ) {
$this->assertEquals( '`Xu\\Components\\Test\\Test` class is not a instance of Xu\\Components\\Component.', $e->getMessage() );
}
}
}
| Add one more component test | Add one more component test
| PHP | mit | wp-xu/framework,wp-xu/xu | php | ## Code Before:
<?php
namespace Xu\Tests\Foundation;
use Xu\Foundation\Foundation;
class Foundation_Test extends \WP_UnitTestCase {
public function test_component() {
$this->assertTrue( xu( '' ) instanceof Foundation );
$this->assertEquals( Foundation::VERSION, xu( 'xu' )->version() );
}
public function test_fn_method() {
$this->assertEquals( 'xu-dashify', \xu()->fn( 'xu_dashify', ['xu_dashify'] ) );
$this->assertEquals( 'xu-dashify', \xu()->fn( 'dashify', 'xu_dashify' ) );
}
public function test_component_method() {
try {
\xu()->component( null );
} catch ( \Exception $e ) {
$this->assertEquals( 'Invalid argument. `$component` must be string.', $e->getMessage() );
}
try {
\xu()->component( 'frozzare.tank.container' );
} catch ( \Exception $e ) {
$this->assertEquals( '`Xu\\Components\\Frozzare\\Tank\\Container` class does not exists.', $e->getMessage() );
}
try {
\xu()->component( 'test' );
} catch ( \Exception $e ) {
$this->assertEquals( '`Xu\\Components\\Test\\Test` class is not a instance of Xu\\Components\\Component.', $e->getMessage() );
}
}
}
## Instruction:
Add one more component test
## Code After:
<?php
namespace Xu\Tests\Foundation;
use Xu\Foundation\Foundation;
class Foundation_Test extends \WP_UnitTestCase {
public function test_component() {
$this->assertTrue( xu( '' ) instanceof Foundation );
$this->assertEquals( Foundation::VERSION, xu( 'xu' )->version() );
}
public function test_fn_method() {
$this->assertEquals( 'xu-dashify', \xu()->fn( 'xu_dashify', ['xu_dashify'] ) );
$this->assertEquals( 'xu-dashify', \xu()->fn( 'dashify', 'xu_dashify' ) );
}
public function test_component_method() {
try {
\xu()->component( null );
} catch ( \Exception $e ) {
$this->assertEquals( 'Invalid argument. `$component` must be string.', $e->getMessage() );
}
try {
\xu()->component( 'frozzare.tank.container' );
} catch ( \Exception $e ) {
$this->assertEquals( '`Xu\\Components\\Frozzare\\Tank\\Container` class does not exists.', $e->getMessage() );
}
try {
\xu()->component( 'test' );
} catch ( \Exception $e ) {
$this->assertEquals( '`Xu\\Components\\Test\\Test` class is not a instance of Xu\\Components\\Component.', $e->getMessage() );
}
try {
\xu()->component( 'Test\\Test' );
} catch ( \Exception $e ) {
$this->assertEquals( '`Xu\\Components\\Test\\Test` class is not a instance of Xu\\Components\\Component.', $e->getMessage() );
}
}
}
|
9fbfc4c13b8ffc90b4d4df57c0e2028d29a7eb0b | run.sh | run.sh |
export WEPLAY_PORT 3000
export WEPLAY_IO_URL "http://159.203.3.193:3001"
export WEPLAY_REDIS "159.203.3.193:6379"
forever start /srv/weplay-web/index.js
tail -f root/.forever/*.log
|
WEPLAY_PORT=3000 WEPLAY_IO_URL="http://159.203.3.193:3001" WEPLAY_REDIS="159.203.3.193:6379" forever /srv/weplay-web/index.js
| Fix export, add environ to forever | Fix export, add environ to forever
| Shell | mit | imdjh/letweplay | shell | ## Code Before:
export WEPLAY_PORT 3000
export WEPLAY_IO_URL "http://159.203.3.193:3001"
export WEPLAY_REDIS "159.203.3.193:6379"
forever start /srv/weplay-web/index.js
tail -f root/.forever/*.log
## Instruction:
Fix export, add environ to forever
## Code After:
WEPLAY_PORT=3000 WEPLAY_IO_URL="http://159.203.3.193:3001" WEPLAY_REDIS="159.203.3.193:6379" forever /srv/weplay-web/index.js
|
3eaac1028f57458c4ef79df9655f3f4c088f99a6 | .github/workflows/github-pages.yml | .github/workflows/github-pages.yml | name: Build and deploy site to GitHub Pages
on:
push:
branches:
- master
jobs:
jekyll:
runs-on: ubuntu-latest
steps:
- name: Checkout the code
uses: actions/checkout@master
- name: Use GitHub Actions cache
uses: actions/cache@master
with:
path: vendor/bundle
key: ${{ runner.os }}-gems-${{ hashFiles('**/Gemfile') }}
restore-keys: |
${{ runner.os }}-gems-
- name: Build the site
uses: helaili/jekyll-action@master
with:
build_only: true
- name: Test the generated HTML for errors
uses: chabad360/htmlproofer@master
with:
directory: "/github/jekyll_build"
- name: Deploy the site
uses: helaili/jekyll-action@master
with:
token: ${{ secrets.GITHUB_TOKEN }}
target_branch: 'gh-pages' | name: Build and deploy site to GitHub Pages
on:
push:
branches:
- master
jobs:
jekyll:
runs-on: ubuntu-latest
steps:
- name: Checkout the code
uses: actions/checkout@master
- name: Use GitHub Actions cache
uses: actions/cache@master
with:
path: vendor/bundle
key: ${{ runner.os }}-gems-${{ hashFiles('**/Gemfile') }}
restore-keys: |
${{ runner.os }}-gems-
- name: Deploy the site
uses: helaili/jekyll-action@master
with:
token: ${{ secrets.GITHUB_TOKEN }}
target_branch: 'gh-pages' | Remove htmlproofer for now, going to add it back later | Remove htmlproofer for now, going to add it back later
| YAML | agpl-3.0 | CorruptComputer/CorruptComputer.github.io,CorruptComputer/CorruptComputer.github.io | yaml | ## Code Before:
name: Build and deploy site to GitHub Pages
on:
push:
branches:
- master
jobs:
jekyll:
runs-on: ubuntu-latest
steps:
- name: Checkout the code
uses: actions/checkout@master
- name: Use GitHub Actions cache
uses: actions/cache@master
with:
path: vendor/bundle
key: ${{ runner.os }}-gems-${{ hashFiles('**/Gemfile') }}
restore-keys: |
${{ runner.os }}-gems-
- name: Build the site
uses: helaili/jekyll-action@master
with:
build_only: true
- name: Test the generated HTML for errors
uses: chabad360/htmlproofer@master
with:
directory: "/github/jekyll_build"
- name: Deploy the site
uses: helaili/jekyll-action@master
with:
token: ${{ secrets.GITHUB_TOKEN }}
target_branch: 'gh-pages'
## Instruction:
Remove htmlproofer for now, going to add it back later
## Code After:
name: Build and deploy site to GitHub Pages
on:
push:
branches:
- master
jobs:
jekyll:
runs-on: ubuntu-latest
steps:
- name: Checkout the code
uses: actions/checkout@master
- name: Use GitHub Actions cache
uses: actions/cache@master
with:
path: vendor/bundle
key: ${{ runner.os }}-gems-${{ hashFiles('**/Gemfile') }}
restore-keys: |
${{ runner.os }}-gems-
- name: Deploy the site
uses: helaili/jekyll-action@master
with:
token: ${{ secrets.GITHUB_TOKEN }}
target_branch: 'gh-pages' |
8f42b814b989441005134d720ab6b3be4b1bb674 | src/Illuminate/Foundation/Auth/VerifiesEmails.php | src/Illuminate/Foundation/Auth/VerifiesEmails.php | <?php
namespace Illuminate\Foundation\Auth;
use Illuminate\Http\Request;
use Illuminate\Auth\Events\Verified;
trait VerifiesEmails
{
use RedirectsUsers;
/**
* Show the email verification notice.
*
* @param \Illuminate\Http\Request $request
* @return \Illuminate\Http\Response
*/
public function show(Request $request)
{
return $request->user()->hasVerifiedEmail()
? redirect($this->redirectPath())
: view('auth.verify');
}
/**
* Mark the authenticated user's email address as verified.
*
* @param \Illuminate\Http\Request $request
* @return \Illuminate\Http\Response
*/
public function verify(Request $request)
{
if ($request->route('id') == $request->user()->getKey() &&
$request->user()->markEmailAsVerified()) {
event(new Verified($request->user()));
}
return redirect($this->redirectPath())->with('verified', true);
}
/**
* Resend the email verification notification.
*
* @param \Illuminate\Http\Request $request
* @return \Illuminate\Http\Response
*/
public function resend(Request $request)
{
if ($request->user()->hasVerifiedEmail()) {
return redirect($this->redirectPath());
}
$request->user()->sendEmailVerificationNotification();
return back()->with('resent', true);
}
}
| <?php
namespace Illuminate\Foundation\Auth;
use Illuminate\Auth\Access\AuthorizationException;
use Illuminate\Http\Request;
use Illuminate\Auth\Events\Verified;
trait VerifiesEmails
{
use RedirectsUsers;
/**
* Show the email verification notice.
*
* @param \Illuminate\Http\Request $request
* @return \Illuminate\Http\Response
*/
public function show(Request $request)
{
return $request->user()->hasVerifiedEmail()
? redirect($this->redirectPath())
: view('auth.verify');
}
/**
* Mark the authenticated user's email address as verified.
*
* @param \Illuminate\Http\Request $request
* @return \Illuminate\Http\Response
* @throws AuthorizationException
*/
public function verify(Request $request)
{
if ($request->route('id') != $request->user()->getKey()) {
throw new AuthorizationException();
}
if ($request->user()->markEmailAsVerified()) {
event(new Verified($request->user()));
}
return redirect($this->redirectPath())->with('verified', true);
}
/**
* Resend the email verification notification.
*
* @param \Illuminate\Http\Request $request
* @return \Illuminate\Http\Response
*/
public function resend(Request $request)
{
if ($request->user()->hasVerifiedEmail()) {
return redirect($this->redirectPath());
}
$request->user()->sendEmailVerificationNotification();
return back()->with('resent', true);
}
}
| Add verify of current user id on email verification | Add verify of current user id on email verification
| PHP | mit | JamesForks/framework,cviebrock/framework,leo108/laravel_framework,stidges/framework,tjbp/framework,jerguslejko/framework,mul14/laravel-framework,tomschlick/framework,arturock/framework,barryvdh/framework,tomschlick/framework,jerguslejko/framework,ChristopheB/framework,srmkliveforks/framework,laravel/framework,halaei/framework,srmkliveforks/framework,JosephSilber/framework,arturock/framework,cviebrock/framework,morrislaptop/framework,cviebrock/framework,gms8994/framework,tjbp/framework,lucasmichot/framework,JamesForks/framework,notebowl/laravel-framework,morrislaptop/framework,leo108/laravel_framework,tjbp/framework,rodrigopedra/framework,stidges/framework,notebowl/laravel-framework,stidges/framework,lucasmichot/framework,stevebauman/framework,JosephSilber/framework,cviebrock/framework,srmkliveforks/framework,bytestream/framework,driesvints/framework,halaei/framework,laravel/framework,dwightwatson/framework,hafezdivandari/framework,rodrigopedra/framework,ChristopheB/framework,ChristopheB/framework,vlakoff/framework,mul14/laravel-framework,crynobone/framework,bytestream/framework,barryvdh/framework,barryvdh/framework,vlakoff/framework,samlev/framework,halaei/framework,cybercog/framework,mul14/laravel-framework,srmkliveforks/framework,vlakoff/framework,gms8994/framework,rodrigopedra/framework,jerguslejko/framework,barryvdh/framework,arturock/framework,jerguslejko/framework,crynobone/framework,mul14/laravel-framework,tomschlick/framework,dwightwatson/framework,hafezdivandari/framework,cybercog/framework,rodrigopedra/framework,cybercog/framework,driesvints/framework,stidges/framework,gms8994/framework,gms8994/framework,stevebauman/framework,samlev/framework | php | ## Code Before:
<?php
namespace Illuminate\Foundation\Auth;
use Illuminate\Http\Request;
use Illuminate\Auth\Events\Verified;
trait VerifiesEmails
{
use RedirectsUsers;
/**
* Show the email verification notice.
*
* @param \Illuminate\Http\Request $request
* @return \Illuminate\Http\Response
*/
public function show(Request $request)
{
return $request->user()->hasVerifiedEmail()
? redirect($this->redirectPath())
: view('auth.verify');
}
/**
* Mark the authenticated user's email address as verified.
*
* @param \Illuminate\Http\Request $request
* @return \Illuminate\Http\Response
*/
public function verify(Request $request)
{
if ($request->route('id') == $request->user()->getKey() &&
$request->user()->markEmailAsVerified()) {
event(new Verified($request->user()));
}
return redirect($this->redirectPath())->with('verified', true);
}
/**
* Resend the email verification notification.
*
* @param \Illuminate\Http\Request $request
* @return \Illuminate\Http\Response
*/
public function resend(Request $request)
{
if ($request->user()->hasVerifiedEmail()) {
return redirect($this->redirectPath());
}
$request->user()->sendEmailVerificationNotification();
return back()->with('resent', true);
}
}
## Instruction:
Add verification of the current user id on email verification
## Code After:
<?php
namespace Illuminate\Foundation\Auth;
use Illuminate\Auth\Access\AuthorizationException;
use Illuminate\Http\Request;
use Illuminate\Auth\Events\Verified;
trait VerifiesEmails
{
use RedirectsUsers;
/**
* Show the email verification notice.
*
* @param \Illuminate\Http\Request $request
* @return \Illuminate\Http\Response
*/
public function show(Request $request)
{
return $request->user()->hasVerifiedEmail()
? redirect($this->redirectPath())
: view('auth.verify');
}
/**
* Mark the authenticated user's email address as verified.
*
* @param \Illuminate\Http\Request $request
* @return \Illuminate\Http\Response
* @throws AuthorizationException
*/
public function verify(Request $request)
{
if ($request->route('id') != $request->user()->getKey()) {
throw new AuthorizationException();
}
if ($request->user()->markEmailAsVerified()) {
event(new Verified($request->user()));
}
return redirect($this->redirectPath())->with('verified', true);
}
/**
* Resend the email verification notification.
*
* @param \Illuminate\Http\Request $request
* @return \Illuminate\Http\Response
*/
public function resend(Request $request)
{
if ($request->user()->hasVerifiedEmail()) {
return redirect($this->redirectPath());
}
$request->user()->sendEmailVerificationNotification();
return back()->with('resent', true);
}
}
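
The change above adds a guard so that a verification link carrying one user's id cannot be consumed while authenticated as another user. As a rough, framework-free sketch of the same check (function name and message are illustrative, not part of the commit):

```php
<?php
// Standalone sketch of the guard added in verify(): reject when the id in the
// route does not match the authenticated user's key. Names are illustrative.
function assertSameUser($routeId, $currentUserKey)
{
    if ($routeId != $currentUserKey) {
        throw new RuntimeException('This verification link belongs to another user.');
    }
}

assertSameUser(7, 7);     // ok: ids match, verification may proceed
try {
    assertSameUser(7, 8); // mismatch: mirrors the new AuthorizationException path
} catch (RuntimeException $e) {
    echo $e->getMessage(), PHP_EOL;
}
```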
|
b07a0a993b4bfaabb57aef416c81a70f933ebaf1 | src/main/webapp/index.html | src/main/webapp/index.html | <!DOCTYPE html>
<html lang="en-US">
<head>
<meta charset="UTF-8" />
<title>Reddit Dashboard</title>
<link rel="stylesheet" href="style.css" />
<script src="script.js"></script>
</head>
<body onload="addThreads()">
<div id="content">
<h1>Reddit Dashboard</h1>
<div id="thread-container"></div>
<button type="button" id="previous" onclick="previousPage()" value="Previous">Previous</button>
<span>Page</span>
<select name="pageNumber" id="pageNumer" onchange="numberPage(value)"></select>
<span id="amountOfPages"> </span>
<button type="button" id="next" onclick="nextPage()" value="Next">Next</button>
<p id="page-error"></p>
</div>
</body>
</html>
| <!DOCTYPE html>
<html lang="en-US">
<head>
<meta charset="UTF-8" />
<title>Youtube Comment Dashboard</title>
<link rel="stylesheet" href="style.css" />
<script src="script.js"></script>
</head>
<body onload="addThreads()">
<div id="content">
<h1>Youtube Comment Dashboard</h1>
<div id="thread-container"></div>
<button type="button" id="previous" onclick="previousPage()" value="Previous">Previous</button>
<span>Page</span>
<select name="pageNumber" id="pageNumer" onchange="numberPage(value)"></select>
<span id="amountOfPages"> </span>
<button type="button" id="next" onclick="nextPage()" value="Next">Next</button>
<p id="page-error"></p>
</div>
</body>
</html>
| Change the web page title | Change the web page title
| HTML | apache-2.0 | googleinterns/step64-2020,googleinterns/step64-2020,googleinterns/step64-2020 | html | ## Code Before:
<!DOCTYPE html>
<html lang="en-US">
<head>
<meta charset="UTF-8" />
<title>Reddit Dashboard</title>
<link rel="stylesheet" href="style.css" />
<script src="script.js"></script>
</head>
<body onload="addThreads()">
<div id="content">
<h1>Reddit Dashboard</h1>
<div id="thread-container"></div>
<button type="button" id="previous" onclick="previousPage()" value="Previous">Previous</button>
<span>Page</span>
<select name="pageNumber" id="pageNumer" onchange="numberPage(value)"></select>
<span id="amountOfPages"> </span>
<button type="button" id="next" onclick="nextPage()" value="Next">Next</button>
<p id="page-error"></p>
</div>
</body>
</html>
## Instruction:
Change the web page title
## Code After:
<!DOCTYPE html>
<html lang="en-US">
<head>
<meta charset="UTF-8" />
<title>Youtube Comment Dashboard</title>
<link rel="stylesheet" href="style.css" />
<script src="script.js"></script>
</head>
<body onload="addThreads()">
<div id="content">
<h1>Youtube Comment Dashboard</h1>
<div id="thread-container"></div>
<button type="button" id="previous" onclick="previousPage()" value="Previous">Previous</button>
<span>Page</span>
<select name="pageNumber" id="pageNumer" onchange="numberPage(value)"></select>
<span id="amountOfPages"> </span>
<button type="button" id="next" onclick="nextPage()" value="Next">Next</button>
<p id="page-error"></p>
</div>
</body>
</html>
|
01c5311c3027c893ddd76cfbec42d88baece1564 | lib/daab-run.js | lib/daab-run.js | // daab run
var fs = require('fs');
var spawn = require('child_process').spawn;
var program = require('commander');
var auth = require('./auth');
program
.allowUnknownOption()
.parse(process.argv);
if (! auth.hasToken()) {
console.log('At first, try "daab login"');
process.exit();
}
var hubot = spawn('bin/hubot', ['run'].concat(process.argv.slice(2)), {
stdio: 'inherit'
});
| // daab run
var fs = require('fs');
var spawn = require('child_process').spawn;
var program = require('commander');
var auth = require('./auth');
program
.allowUnknownOption()
.parse(process.argv);
if (! auth.hasToken()) {
console.log('At first, try "daab login"');
process.exit();
}
var cmd = process.platform === 'win32' ? 'bin\\hubot.cmd' : 'bin/hubot';
var hubot = spawn(cmd, ['run'].concat(process.argv.slice(2)), {
stdio: 'inherit'
});
| Fix launch command on windows platform. | Fix launch command on windows platform.
| JavaScript | mit | lisb/daab,lisb/daab,lisb/daab | javascript | ## Code Before:
// daab run
var fs = require('fs');
var spawn = require('child_process').spawn;
var program = require('commander');
var auth = require('./auth');
program
.allowUnknownOption()
.parse(process.argv);
if (! auth.hasToken()) {
console.log('At first, try "daab login"');
process.exit();
}
var hubot = spawn('bin/hubot', ['run'].concat(process.argv.slice(2)), {
stdio: 'inherit'
});
## Instruction:
Fix launch command on the Windows platform.
## Code After:
// daab run
var fs = require('fs');
var spawn = require('child_process').spawn;
var program = require('commander');
var auth = require('./auth');
program
.allowUnknownOption()
.parse(process.argv);
if (! auth.hasToken()) {
console.log('At first, try "daab login"');
process.exit();
}
var cmd = process.platform === 'win32' ? 'bin\\hubot.cmd' : 'bin/hubot';
var hubot = spawn(cmd, ['run'].concat(process.argv.slice(2)), {
stdio: 'inherit'
});
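
The fix above selects a platform-specific launcher, since on Windows the hubot wrapper is the batch file bin\hubot.cmd rather than the Unix bin/hubot script. A minimal standalone sketch of the same selection logic:

```javascript
// Resolve a spawnable command per platform; 'bin\\hubot.cmd' vs 'bin/hubot'
// mirrors the change above. Run with `node pick-cmd.js` to see the choice.
function hubotCommand(platform) {
  return platform === 'win32' ? 'bin\\hubot.cmd' : 'bin/hubot';
}

console.log('current platform:', process.platform);
console.log('would spawn:', hubotCommand(process.platform));
console.log('on win32 it would spawn:', hubotCommand('win32'));
```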
|
9dc025638dc28a6feaeeb5f604d40f536d67d5a3 | app/models/document.rb | app/models/document.rb | class Document
attr_reader :id, :filename
DIR_PATH = 'db/docs'
URL_PATH = 'docs'
class << self
def all
Dir.glob("#{DIR_PATH}/*.md").sort.map do |filename|
self.new(File.basename(filename, '.*'))
end
end
end
def initialize(filename)
@filename = filename
end
def path
"#{DIR_PATH}/#{self.filename}.md"
end
def url
"/#{URL_PATH}/#{self.filename}"
end
def exists?
return false if path.include? "\u0000"
File.exists?(path)
end
def title
@title ||= exists? ? self.content.lines.first[2..-1].strip.gsub('<br>', '') : ''
end
def description
@desc ||= exists? ? self.content.lines.reject{|l| l =~ /^(\n|<)/ }.second.gsub('<br>', '').strip : ''
end
def content
@content ||= exists? ? File.read(path) : ''
end
end
| class Document
attr_reader :id, :filename
DIR_PATH = 'db/docs'
URL_PATH = 'docs'
class << self
def all
Dir.glob("#{DIR_PATH}/*.md").sort.map do |filename|
self.new(File.basename(filename, '.*'))
end
end
end
def initialize(filename)
@filename = filename
end
def path
"#{DIR_PATH}/#{self.filename}.md"
end
def updated_at
Time.at(%x(git --no-pager log -1 --format=%ct "#{self.path}").to_i)
.utc.strftime "%Y-%m-%dT%H:%M:%SZ"
end
def url
"/#{URL_PATH}/#{self.filename}"
end
def exists?
return false if path.include? "\u0000"
File.exists?(path)
end
def title
@title ||= exists? ? self.content.lines.first[2..-1].strip.gsub('<br>', '') : ''
end
def description
@desc ||= exists? ? self.content.lines.reject{|l| l =~ /^(\n|<)/ }.second.gsub('<br>', '').strip : ''
end
def content
@content ||= exists? ? File.read(path) : ''
end
end
| Enable to fetch doc's updated_at from git log | Enable to fetch doc's updated_at from git log
| Ruby | mit | yasslab/coderdojo.jp,coderdojo-japan/coderdojo.jp,yasslab/coderdojo.jp,coderdojo-japan/coderdojo.jp,yasslab/coderdojo.jp,coderdojo-japan/coderdojo.jp,coderdojo-japan/coderdojo.jp | ruby | ## Code Before:
class Document
attr_reader :id, :filename
DIR_PATH = 'db/docs'
URL_PATH = 'docs'
class << self
def all
Dir.glob("#{DIR_PATH}/*.md").sort.map do |filename|
self.new(File.basename(filename, '.*'))
end
end
end
def initialize(filename)
@filename = filename
end
def path
"#{DIR_PATH}/#{self.filename}.md"
end
def url
"/#{URL_PATH}/#{self.filename}"
end
def exists?
return false if path.include? "\u0000"
File.exists?(path)
end
def title
@title ||= exists? ? self.content.lines.first[2..-1].strip.gsub('<br>', '') : ''
end
def description
@desc ||= exists? ? self.content.lines.reject{|l| l =~ /^(\n|<)/ }.second.gsub('<br>', '').strip : ''
end
def content
@content ||= exists? ? File.read(path) : ''
end
end
## Instruction:
Enable fetching a doc's updated_at from the git log
## Code After:
class Document
attr_reader :id, :filename
DIR_PATH = 'db/docs'
URL_PATH = 'docs'
class << self
def all
Dir.glob("#{DIR_PATH}/*.md").sort.map do |filename|
self.new(File.basename(filename, '.*'))
end
end
end
def initialize(filename)
@filename = filename
end
def path
"#{DIR_PATH}/#{self.filename}.md"
end
def updated_at
Time.at(%x(git --no-pager log -1 --format=%ct "#{self.path}").to_i)
.utc.strftime "%Y-%m-%dT%H:%M:%SZ"
end
def url
"/#{URL_PATH}/#{self.filename}"
end
def exists?
return false if path.include? "\u0000"
File.exists?(path)
end
def title
@title ||= exists? ? self.content.lines.first[2..-1].strip.gsub('<br>', '') : ''
end
def description
@desc ||= exists? ? self.content.lines.reject{|l| l =~ /^(\n|<)/ }.second.gsub('<br>', '').strip : ''
end
def content
@content ||= exists? ? File.read(path) : ''
end
end
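
The new updated_at shells out to git: %ct is the committer date of the last commit that touched the file, rendered as UTC ISO-8601. A standalone sketch of the same lookup (the path is an assumption, not taken from the commit):

```ruby
# Standalone version of the lookup used by Document#updated_at.
# Prints e.g. "2017-05-02T09:15:30Z"; note that an untracked file yields an
# empty string, and "".to_i is 0, so the result falls back to 1970-01-01.
path = "db/docs/example.md"   # illustrative path
timestamp = %x(git --no-pager log -1 --format=%ct "#{path}").to_i
puts Time.at(timestamp).utc.strftime("%Y-%m-%dT%H:%M:%SZ")
```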
|
4d7402adb5e9f97c299c5f4fb6b2c62915298962 | requirements.txt | requirements.txt | mock>=1.01
pathlib>=1.0
python-dateutil>=2.3
| mock>=1.0.1
nose>=1.3.4
pathlib>=1.0
python-dateutil>=2.3
| Correct version string for mock. Added nose. | Correct version string for mock. Added nose.
| Text | bsd-3-clause | NaturalHistoryMuseum/inselect,NaturalHistoryMuseum/inselect | text | ## Code Before:
mock>=1.01
pathlib>=1.0
python-dateutil>=2.3
## Instruction:
Correct version string for mock. Add nose.
## Code After:
mock>=1.0.1
nose>=1.3.4
pathlib>=1.0
python-dateutil>=2.3
|
892386c344d2c19f097053eb87fa70b0e3d239d1 | SuperBuild/External_SimpleITKExamples.cmake | SuperBuild/External_SimpleITKExamples.cmake |
set(proj SimpleITKExamples)
# Set dependency list
set(SimpleITKExamples_DEPENDENCIES "SimpleITK")
if (${BUILD_EXAMPLES} )
file(WRITE "${CMAKE_CURRENT_BINARY_DIR}/${proj}-build/CMakeCacheInit.txt" "${ep_common_cache}" )
ExternalProject_Add(${proj}
DOWNLOAD_COMMAND ""
UPDATE_COMMAND ""
SOURCE_DIR ${CMAKE_CURRENT_SOURCE_DIR}/../Examples
BINARY_DIR ${proj}-build
CMAKE_GENERATOR ${gen}
CMAKE_ARGS
--no-warn-unused-cli
-C "${CMAKE_CURRENT_BINARY_DIR}/${proj}-build/CMakeCacheInit.txt"
${ep_common_args}
-DSimpleITK_DIR:PATH=${CMAKE_CURRENT_BINARY_DIR}/lib/cmake/SimpleITK-0.11/
-DCMAKE_SKIP_RPATH:BOOL=ON
-DCMAKE_INSTALL_PREFIX:PATH=<INSTALL_DIR>
BUILD_COMMAND ${BUILD_COMMAND_STRING}
INSTALL_COMMAND ""
DEPENDS "${SimpleITKExamples_DEPENDENCIES}"
${External_Project_USES_TERMINAL}
)
endif()
|
set(proj SimpleITKExamples)
# Set dependency list
set(SimpleITKExamples_DEPENDENCIES "SimpleITK")
if (${BUILD_EXAMPLES} )
file(WRITE "${CMAKE_CURRENT_BINARY_DIR}/${proj}-build/CMakeCacheInit.txt" "${ep_common_cache}" )
ExternalProject_Add(${proj}
DOWNLOAD_COMMAND ""
UPDATE_COMMAND ""
SOURCE_DIR ${CMAKE_CURRENT_SOURCE_DIR}/../Examples
BINARY_DIR ${proj}-build
CMAKE_GENERATOR ${gen}
CMAKE_ARGS
--no-warn-unused-cli
-C "${CMAKE_CURRENT_BINARY_DIR}/${proj}-build/CMakeCacheInit.txt"
${ep_common_args}
-DSimpleITK_DIR:PATH=${CMAKE_INSTALL_PREFIX}/lib/cmake/SimpleITK-1.0/
-DCMAKE_SKIP_RPATH:BOOL=ON
-DCMAKE_INSTALL_PREFIX:PATH=<INSTALL_DIR>
BUILD_COMMAND ${BUILD_COMMAND_STRING}
INSTALL_COMMAND ""
DEPENDS "${SimpleITKExamples_DEPENDENCIES}"
${External_Project_USES_TERMINAL}
)
endif()
| Fix the SimpleITK install directory used for superbuild SimpleITK Examples | Fix the SimpleITK install directory used for superbuild SimpleITK Examples
The version was not updated to SimpleITK version 1.0 and the binary
build directory was used not the install directory.
| CMake | apache-2.0 | kaspermarstal/SimpleElastix,kaspermarstal/SimpleElastix,kaspermarstal/SimpleElastix,kaspermarstal/SimpleElastix,InsightSoftwareConsortium/SimpleITK,InsightSoftwareConsortium/SimpleITK,blowekamp/SimpleITK,SimpleITK/SimpleITK,richardbeare/SimpleITK,SimpleITK/SimpleITK,InsightSoftwareConsortium/SimpleITK,InsightSoftwareConsortium/SimpleITK,richardbeare/SimpleITK,InsightSoftwareConsortium/SimpleITK,InsightSoftwareConsortium/SimpleITK,kaspermarstal/SimpleElastix,kaspermarstal/SimpleElastix,InsightSoftwareConsortium/SimpleITK,blowekamp/SimpleITK,SimpleITK/SimpleITK,blowekamp/SimpleITK,kaspermarstal/SimpleElastix,blowekamp/SimpleITK,richardbeare/SimpleITK,richardbeare/SimpleITK,richardbeare/SimpleITK,InsightSoftwareConsortium/SimpleITK,SimpleITK/SimpleITK,blowekamp/SimpleITK,SimpleITK/SimpleITK,blowekamp/SimpleITK,richardbeare/SimpleITK,SimpleITK/SimpleITK,richardbeare/SimpleITK,blowekamp/SimpleITK,SimpleITK/SimpleITK,SimpleITK/SimpleITK,blowekamp/SimpleITK,richardbeare/SimpleITK | cmake | ## Code Before:
set(proj SimpleITKExamples)
# Set dependency list
set(SimpleITKExamples_DEPENDENCIES "SimpleITK")
if (${BUILD_EXAMPLES} )
file(WRITE "${CMAKE_CURRENT_BINARY_DIR}/${proj}-build/CMakeCacheInit.txt" "${ep_common_cache}" )
ExternalProject_Add(${proj}
DOWNLOAD_COMMAND ""
UPDATE_COMMAND ""
SOURCE_DIR ${CMAKE_CURRENT_SOURCE_DIR}/../Examples
BINARY_DIR ${proj}-build
CMAKE_GENERATOR ${gen}
CMAKE_ARGS
--no-warn-unused-cli
-C "${CMAKE_CURRENT_BINARY_DIR}/${proj}-build/CMakeCacheInit.txt"
${ep_common_args}
-DSimpleITK_DIR:PATH=${CMAKE_CURRENT_BINARY_DIR}/lib/cmake/SimpleITK-0.11/
-DCMAKE_SKIP_RPATH:BOOL=ON
-DCMAKE_INSTALL_PREFIX:PATH=<INSTALL_DIR>
BUILD_COMMAND ${BUILD_COMMAND_STRING}
INSTALL_COMMAND ""
DEPENDS "${SimpleITKExamples_DEPENDENCIES}"
${External_Project_USES_TERMINAL}
)
endif()
## Instruction:
Fix the SimpleITK install directory used for superbuild SimpleITK Examples
The version was not updated to SimpleITK version 1.0 and the binary
build directory was used not the install directory.
## Code After:
set(proj SimpleITKExamples)
# Set dependency list
set(SimpleITKExamples_DEPENDENCIES "SimpleITK")
if (${BUILD_EXAMPLES} )
file(WRITE "${CMAKE_CURRENT_BINARY_DIR}/${proj}-build/CMakeCacheInit.txt" "${ep_common_cache}" )
ExternalProject_Add(${proj}
DOWNLOAD_COMMAND ""
UPDATE_COMMAND ""
SOURCE_DIR ${CMAKE_CURRENT_SOURCE_DIR}/../Examples
BINARY_DIR ${proj}-build
CMAKE_GENERATOR ${gen}
CMAKE_ARGS
--no-warn-unused-cli
-C "${CMAKE_CURRENT_BINARY_DIR}/${proj}-build/CMakeCacheInit.txt"
${ep_common_args}
-DSimpleITK_DIR:PATH=${CMAKE_INSTALL_PREFIX}/lib/cmake/SimpleITK-1.0/
-DCMAKE_SKIP_RPATH:BOOL=ON
-DCMAKE_INSTALL_PREFIX:PATH=<INSTALL_DIR>
BUILD_COMMAND ${BUILD_COMMAND_STRING}
INSTALL_COMMAND ""
DEPENDS "${SimpleITKExamples_DEPENDENCIES}"
${External_Project_USES_TERMINAL}
)
endif()
|
a91498c793f73551027a3c408befaa1b2de4b3da | public/css/style.css | public/css/style.css | .add-item {
margin: 6px;
} | .add-item {
margin: 6px;
}
.item-cost {
font-family: "Lucida Console", Monaco, monospace;
}
| Use monospace font for cost | Use monospace font for cost
| CSS | mit | hasyimibhar/bajet,hasyimibhar/bajet | css | ## Code Before:
.add-item {
margin: 6px;
}
## Instruction:
Use monospace font for cost
## Code After:
.add-item {
margin: 6px;
}
.item-cost {
font-family: "Lucida Console", Monaco, monospace;
}
|
d3f7c1a592a6f81befa89f51226ed0a97295a157 | CONTRIBUTING.md | CONTRIBUTING.md |
- **To add to the list:** Submit a pull request
- **To remove from the list:** Open an issue
- List items should be sorted *alphabetically*.
- Each item should be limited to one link
- The link should be the name of the package or project
- Direct installation commands should follow on the next line, indented by 2 spaces and enclosed in \`\`
- Descriptions should be clear, concise, and non-promotional
- Descriptions should follow the link, on the same line
## Quality standard
To stay on the list, package repositories should adhere to these quality standards:
- Generally useful to the community
- Functional
- Stable
## Reporting issues
Please open an issue if you find anything that could be improved or have suggestions for making the list a more valuable resource. Thanks!
|
- **To add to the list:** Submit a pull request
- **To remove from the list:** Open an issue
- List items should be sorted *alphabetically*.
- Each item should be limited to one link
- The link should be the name of the package or project
- Direct installation commands should follow on the next line, indented by 2 spaces and enclosed in \`\`
- Descriptions should be clear, concise, and non-promotional
- Descriptions should follow the link, on the same line
- Run `npm install` and then `npm test` to verify everything is correct according to guidelines
## Quality standard
To stay on the list, package repositories should adhere to these quality standards:
- Generally useful to the community
- Functional
- Stable
## Reporting issues
Please open an issue if you find anything that could be improved or have suggestions for making the list a more valuable resource. Thanks!
| Update instructions in Contributing to include testing information | Update instructions in Contributing to include testing information
| Markdown | mit | davidlebr1/awesome-ctf,apsdehal/awesome-ctf,601040605/awesome-ctf,shekkbuilder/awesome-ctf,jack51706/awesome-ctf,aancw/awesome-ctf,sillvan/awesome-ctf | markdown | ## Code Before:
- **To add to the list:** Submit a pull request
- **To remove from the list:** Open an issue
- List items should be sorted *alphabetically*.
- Each item should be limited to one link
- The link should be the name of the package or project
- Direct installation commands should follow on the next line, indented by 2 spaces and enclosed in \`\`
- Descriptions should be clear, concise, and non-promotional
- Descriptions should follow the link, on the same line
## Quality standard
To stay on the list, package repositories should adhere to these quality standards:
- Generally useful to the community
- Functional
- Stable
## Reporting issues
Please open an issue if you find anything that could be improved or have suggestions for making the list a more valuable resource. Thanks!
## Instruction:
Update instructions in Contributing to include testing information
## Code After:
- **To add to the list:** Submit a pull request
- **To remove from the list:** Open an issue
- List items should be sorted *alphabetically*.
- Each item should be limited to one link
- The link should be the name of the package or project
- Direct installation commands should follow on the next line, indented by 2 spaces and enclosed in \`\`
- Descriptions should be clear, concise, and non-promotional
- Descriptions should follow the link, on the same line
- Run `npm install` and then `npm test` to verify everything is correct according to guidelines
## Quality standard
To stay on the list, package repositories should adhere to these quality standards:
- Generally useful to the community
- Functional
- Stable
## Reporting issues
Please open an issue if you find anything that could be improved or have suggestions for making the list a more valuable resource. Thanks!
|
5fa2dfacd0967842842067429122732d2b50faee | app/views/surveys/question_templates/_prefer_not_to_answer_box.html.haml | app/views/surveys/question_templates/_prefer_not_to_answer_box.html.haml | .input-group.center-block
.check-box-container.preferred-not-to-answer
= check_box_tag "#{question.to_param}[preferred_not_to_answer]", "1", answer.preferred_not_to_answer?, disabled: answer.locked?, name: 'response[preferred_not_to_answer]'
%label{ for: "#{question.to_param}[preferred_not_to_answer]", style: "#{'cursor: default' if answer.locked?}", class: "#{'hidden' if (answer.locked? and !answer.preferred_not_to_answer?)}" }
%span.hotkey{ style: "#{'cursor: default' if answer.locked?}" } ✓
%p Prefer not to answer
| .input-group.center-block
.check-box-container.preferred-not-to-answer
= check_box_tag "#{question.to_param}[preferred_not_to_answer]", "1", answer.preferred_not_to_answer?, disabled: answer.locked?, name: 'response[preferred_not_to_answer]'
%label{ for: "#{question.to_param}[preferred_not_to_answer]", style: "#{'cursor: default' if answer.locked?}", class: "#{'hidden' if (answer.locked? and !answer.preferred_not_to_answer?)}" }
%span.hotkey{ style: "#{'cursor: default' if answer.locked?}" } ✓
%p Skip this question
| Use similar text for skipping a question, to be consistent with ResearchKit app | Use similar text for skipping a question, to be consistent with ResearchKit app
| Haml | mit | myapnea/www.myapnea.org,myapnea/www.myapnea.org,myapnea/www.myapnea.org | haml | ## Code Before:
.input-group.center-block
.check-box-container.preferred-not-to-answer
= check_box_tag "#{question.to_param}[preferred_not_to_answer]", "1", answer.preferred_not_to_answer?, disabled: answer.locked?, name: 'response[preferred_not_to_answer]'
%label{ for: "#{question.to_param}[preferred_not_to_answer]", style: "#{'cursor: default' if answer.locked?}", class: "#{'hidden' if (answer.locked? and !answer.preferred_not_to_answer?)}" }
%span.hotkey{ style: "#{'cursor: default' if answer.locked?}" } ✓
%p Prefer not to answer
## Instruction:
Use similar text for skipping a question, to be consistent with ResearchKit app
## Code After:
.input-group.center-block
.check-box-container.preferred-not-to-answer
= check_box_tag "#{question.to_param}[preferred_not_to_answer]", "1", answer.preferred_not_to_answer?, disabled: answer.locked?, name: 'response[preferred_not_to_answer]'
%label{ for: "#{question.to_param}[preferred_not_to_answer]", style: "#{'cursor: default' if answer.locked?}", class: "#{'hidden' if (answer.locked? and !answer.preferred_not_to_answer?)}" }
%span.hotkey{ style: "#{'cursor: default' if answer.locked?}" } ✓
%p Skip this question
|
0613d7b749b6bfffc6094b1c382898d235c884f1 | data/docker-storage-setup-env.sh | data/docker-storage-setup-env.sh |
STORAGE_DRIVER=devicemapper
DEVS=/dev/sdb
VG=dockervg
DATA_SIZE=40%FREE
MIN_DATA_SIZE=2G
CHUNK_SIZE=512K
GROWPART=false
AUTO_EXTEND_POOL=yes
POOL_AUTOEXTEND_THRESHOLD=60
POOL_AUTOEXTEND_PERCENT=20
[ -e docker-storage-setup-env.local.sh ] && . docker-storage-setup-env.local.sh
|
DEVS=/dev/sdb
VG=dockervg
[ -e docker-storage-setup-env.local.sh ] && . docker-storage-setup-env.local.sh
| Remove initial override directives that are of same value from the conf in the package. | Remove initial override directives that are of same value from the conf in the package.
| Shell | apache-2.0 | flaccid/vagrant-docker-storage-setup,flaccid/vagrant-docker-storage-setup | shell | ## Code Before:
STORAGE_DRIVER=devicemapper
DEVS=/dev/sdb
VG=dockervg
DATA_SIZE=40%FREE
MIN_DATA_SIZE=2G
CHUNK_SIZE=512K
GROWPART=false
AUTO_EXTEND_POOL=yes
POOL_AUTOEXTEND_THRESHOLD=60
POOL_AUTOEXTEND_PERCENT=20
[ -e docker-storage-setup-env.local.sh ] && . docker-storage-setup-env.local.sh
## Instruction:
Remove initial override directives that have the same value as the conf shipped in the package.
## Code After:
DEVS=/dev/sdb
VG=dockervg
[ -e docker-storage-setup-env.local.sh ] && . docker-storage-setup-env.local.sh
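
With the defaults removed, host-specific values are expected to come from docker-storage-setup-env.local.sh, which the last line sources when present. One possible local override file, with purely illustrative values:

```sh
# docker-storage-setup-env.local.sh - optional, per-host overrides.
# Only set what differs from the packaged docker-storage-setup defaults.
DEVS=/dev/sdc
VG=dockervg
DATA_SIZE=60%FREE
POOL_AUTOEXTEND_THRESHOLD=80
```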
|
089bccd649dd2479fe71d9a80069ec905933660f | index.php | index.php | <?php
require_once 'recaptcha.class.php';
$form_id = "form";
$site_key = 'YOUR_SITE_KEY';
$secret_key = 'YOUR_SECRET_KEY';
$recaptcha = new Recaptcha($form_id, $site_key, $secret_key);
?>
<!doctype html>
<html>
<head>
<title>Invisible ReCAPTCHA API Class Demo</title>
</head>
<body>
<?php ob_start(); ?>
<form id="form" method="post">
<input type="text" name="example" value="" placeholder="Example Field" />
<?php echo $recaptcha->button(); ?>
</form>
<?php
$form = ob_get_clean();
if(!empty($_POST)) {
$response = $recaptcha->get_response();
echo "Success: ". ( $response->success ? 'true' : 'false' ) ."<br />\r\n";
echo "Score: ". $response->score ."<br />\r\n";
if($response->success) {
echo "Passed Captcha.";
} else {
echo "Failed Captcha.";
echo $form;
}
} else {
echo $form;
}
echo $recaptcha->script();
?>
</body>
</html>
| <?php
require_once 'recaptcha.class.php';
$form_id = "form";
$site_key = 'YOUR_SITE_KEY';
$secret_key = 'YOUR_SECRET_KEY';
$recaptcha = new Recaptcha($form_id, $site_key, $secret_key);
?>
<!doctype html>
<html>
<head>
<title>ReCAPTCHA v3 API Class Demo</title>
</head>
<body>
<?php ob_start(); ?>
<form id="form" method="post">
<input type="text" name="example" value="" placeholder="Example Field" />
<?php echo $recaptcha->button(); ?>
</form>
<?php
$form = ob_get_clean();
if(!empty($_POST)) {
$response = $recaptcha->get_response();
echo "Success: ". ( $response->success ? 'true' : 'false' ) ."<br />\r\n";
echo "Score: ". $response->score ."<br />\r\n";
if($response->success) {
echo "Passed Captcha.";
} else {
echo "Failed Captcha.";
echo $form;
}
} else {
echo $form;
}
echo $recaptcha->script();
?>
</body>
</html>
| Update title to ReCAPTCHA v3 | Update title to ReCAPTCHA v3 | PHP | mit | svenagen/recaptcha,JoelLisenby/recaptcha | php | ## Code Before:
<?php
require_once 'recaptcha.class.php';
$form_id = "form";
$site_key = 'YOUR_SITE_KEY';
$secret_key = 'YOUR_SECRET_KEY';
$recaptcha = new Recaptcha($form_id, $site_key, $secret_key);
?>
<!doctype html>
<html>
<head>
<title>Invisible ReCAPTCHA API Class Demo</title>
</head>
<body>
<?php ob_start(); ?>
<form id="form" method="post">
<input type="text" name="example" value="" placeholder="Example Field" />
<?php echo $recaptcha->button(); ?>
</form>
<?php
$form = ob_get_clean();
if(!empty($_POST)) {
$response = $recaptcha->get_response();
echo "Success: ". ( $response->success ? 'true' : 'false' ) ."<br />\r\n";
echo "Score: ". $response->score ."<br />\r\n";
if($response->success) {
echo "Passed Captcha.";
} else {
echo "Failed Captcha.";
echo $form;
}
} else {
echo $form;
}
echo $recaptcha->script();
?>
</body>
</html>
## Instruction:
Update title to ReCAPTCHA v3
## Code After:
<?php
require_once 'recaptcha.class.php';
$form_id = "form";
$site_key = 'YOUR_SITE_KEY';
$secret_key = 'YOUR_SECRET_KEY';
$recaptcha = new Recaptcha($form_id, $site_key, $secret_key);
?>
<!doctype html>
<html>
<head>
<title>ReCAPTCHA v3 API Class Demo</title>
</head>
<body>
<?php ob_start(); ?>
<form id="form" method="post">
<input type="text" name="example" value="" placeholder="Example Field" />
<?php echo $recaptcha->button(); ?>
</form>
<?php
$form = ob_get_clean();
if(!empty($_POST)) {
$response = $recaptcha->get_response();
echo "Success: ". ( $response->success ? 'true' : 'false' ) ."<br />\r\n";
echo "Score: ". $response->score ."<br />\r\n";
if($response->success) {
echo "Passed Captcha.";
} else {
echo "Failed Captcha.";
echo $form;
}
} else {
echo $form;
}
echo $recaptcha->script();
?>
</body>
</html>
|
db9e7163a9aef3f41af2e71f074611dae0f5ac39 | lispkit.lisp | lispkit.lisp | (defpackage lispkit
(:use :gtk :gdk :gdk-pixbuf :gobject
:drakma :cl-webkit
:glib :gio :pango :cairo :common-lisp)
(:export #:main))
(in-package :lispkit)
(defparameter *current-tab* nil)
(defun handle-key (window event)
(declare (ignore window))
(print event)
nil)
(defun main (&rest args)
(declare (ignore args))
(within-main-loop
(let ((window (make-instance 'gtk:gtk-window :title "lispkit!"))
(view (setq *current-tab*
(webkit.foreign:webkit-web-view-new))))
(gtk-container-add window view)
(gtk-container-add window (make-instance 'gtk-scrolled-window))
(webkit.foreign:webkit-web-view-load-uri
view "http://www.github.com/AeroNotix/lispkit")
(g-signal-connect window "key_press_event"
#'handle-key)
(gtk-widget-show-all window))))
| (defpackage lispkit
(:use :gtk :gdk :gdk-pixbuf :gobject
:drakma :cl-webkit
:glib :gio :pango :cairo :common-lisp)
(:export #:main))
(in-package :lispkit)
(defparameter *current-tab* nil)
(defun load-url (url &optional view)
(webkit.foreign:webkit-web-view-load-uri
(or view *current-tab*) url))
(defun handle-key (window event)
(declare (ignore window))
(print event)
nil)
(defun main (&rest args)
(declare (ignore args))
(within-main-loop
(let ((window (make-instance 'gtk:gtk-window :title "lispkit!"))
(view (setq *current-tab*
(webkit.foreign:webkit-web-view-new))))
(gtk-container-add window view)
(gtk-container-add window (make-instance 'gtk-scrolled-window))
(g-signal-connect window "key_press_event"
#'handle-key)
(load-url "http://www.github.com/AeroNotix/lispkit")
(gtk-widget-show-all window))))
| Put load-url into a function | Put load-url into a function
| Common Lisp | bsd-2-clause | AeroNotix/lispkit,AeroNotix/lispkit | common-lisp | ## Code Before:
(defpackage lispkit
(:use :gtk :gdk :gdk-pixbuf :gobject
:drakma :cl-webkit
:glib :gio :pango :cairo :common-lisp)
(:export #:main))
(in-package :lispkit)
(defparameter *current-tab* nil)
(defun handle-key (window event)
(declare (ignore window))
(print event)
nil)
(defun main (&rest args)
(declare (ignore args))
(within-main-loop
(let ((window (make-instance 'gtk:gtk-window :title "lispkit!"))
(view (setq *current-tab*
(webkit.foreign:webkit-web-view-new))))
(gtk-container-add window view)
(gtk-container-add window (make-instance 'gtk-scrolled-window))
(webkit.foreign:webkit-web-view-load-uri
view "http://www.github.com/AeroNotix/lispkit")
(g-signal-connect window "key_press_event"
#'handle-key)
(gtk-widget-show-all window))))
## Instruction:
Put load-url into a function
## Code After:
(defpackage lispkit
(:use :gtk :gdk :gdk-pixbuf :gobject
:drakma :cl-webkit
:glib :gio :pango :cairo :common-lisp)
(:export #:main))
(in-package :lispkit)
(defparameter *current-tab* nil)
(defun load-url (url &optional view)
(webkit.foreign:webkit-web-view-load-uri
(or view *current-tab*) url))
(defun handle-key (window event)
(declare (ignore window))
(print event)
nil)
(defun main (&rest args)
(declare (ignore args))
(within-main-loop
(let ((window (make-instance 'gtk:gtk-window :title "lispkit!"))
(view (setq *current-tab*
(webkit.foreign:webkit-web-view-new))))
(gtk-container-add window view)
(gtk-container-add window (make-instance 'gtk-scrolled-window))
(g-signal-connect window "key_press_event"
#'handle-key)
(load-url "http://www.github.com/AeroNotix/lispkit")
(gtk-widget-show-all window))))
|
d299c8cf01a856e0c53599e0d0e45ef9063ccad7 | engines/order_management/spec/services/order_management/stock/packer_spec.rb | engines/order_management/spec/services/order_management/stock/packer_spec.rb |
require 'spec_helper'
module OrderManagement
module Stock
describe Packer do
let(:order) { create(:order_with_line_items, line_items_count: 5) }
let(:stock_location) { create(:stock_location) }
subject { Packer.new(stock_location, order) }
before { order.line_items.first.variant.weight = 1 }
it 'builds a package with all the items' do
package = subject.package
expect(package.contents.size).to eq 5
expect(package.weight).to be_positive
end
it 'variants are added as backordered without enough on_hand' do
expect(stock_location).to receive(:fill_status).exactly(5).times.and_return([2, 3])
package = subject.package
expect(package.on_hand.size).to eq 5
expect(package.backordered.size).to eq 5
end
end
end
end
|
require 'spec_helper'
module OrderManagement
module Stock
describe Packer do
let(:order) { create(:order_with_line_items, line_items_count: 5) }
let(:stock_location) { create(:stock_location) }
subject { Packer.new(stock_location, order) }
before { order.line_items.first.variant.update(unit_value: 100) }
it 'builds a package with all the items' do
package = subject.package
expect(package.contents.size).to eq 5
expect(package.weight).to be_positive
end
it 'variants are added as backordered without enough on_hand' do
expect(stock_location).to receive(:fill_status).exactly(5).times.and_return([2, 3])
package = subject.package
expect(package.on_hand.size).to eq 5
expect(package.backordered.size).to eq 5
end
end
end
end
| Update test setup in Packer spec. | Update test setup in Packer spec.
This is the correct way to set a variant's weight.
| Ruby | agpl-3.0 | lin-d-hop/openfoodnetwork,Matt-Yorkley/openfoodnetwork,openfoodfoundation/openfoodnetwork,lin-d-hop/openfoodnetwork,mkllnk/openfoodnetwork,lin-d-hop/openfoodnetwork,mkllnk/openfoodnetwork,openfoodfoundation/openfoodnetwork,Matt-Yorkley/openfoodnetwork,openfoodfoundation/openfoodnetwork,lin-d-hop/openfoodnetwork,mkllnk/openfoodnetwork,Matt-Yorkley/openfoodnetwork,mkllnk/openfoodnetwork,openfoodfoundation/openfoodnetwork,Matt-Yorkley/openfoodnetwork | ruby | ## Code Before:
require 'spec_helper'
module OrderManagement
module Stock
describe Packer do
let(:order) { create(:order_with_line_items, line_items_count: 5) }
let(:stock_location) { create(:stock_location) }
subject { Packer.new(stock_location, order) }
before { order.line_items.first.variant.weight = 1 }
it 'builds a package with all the items' do
package = subject.package
expect(package.contents.size).to eq 5
expect(package.weight).to be_positive
end
it 'variants are added as backordered without enough on_hand' do
expect(stock_location).to receive(:fill_status).exactly(5).times.and_return([2, 3])
package = subject.package
expect(package.on_hand.size).to eq 5
expect(package.backordered.size).to eq 5
end
end
end
end
## Instruction:
Update test setup in Packer spec.
This is the correct way to set a variant's weight.
## Code After:
require 'spec_helper'
module OrderManagement
module Stock
describe Packer do
let(:order) { create(:order_with_line_items, line_items_count: 5) }
let(:stock_location) { create(:stock_location) }
subject { Packer.new(stock_location, order) }
before { order.line_items.first.variant.update(unit_value: 100) }
it 'builds a package with all the items' do
package = subject.package
expect(package.contents.size).to eq 5
expect(package.weight).to be_positive
end
it 'variants are added as backordered without enough on_hand' do
expect(stock_location).to receive(:fill_status).exactly(5).times.and_return([2, 3])
package = subject.package
expect(package.on_hand.size).to eq 5
expect(package.backordered.size).to eq 5
end
end
end
end
|
f8ac81301df0950c8fcd333a4d8954b9ab3736b6 | tools/gen-code-from-swagger.sh | tools/gen-code-from-swagger.sh | version=0.15.0
# Always points to the directory of this script.
DIR="$( cd "$( dirname "${BASH_SOURCE[0]}" )" && pwd )"
SWAGGER=$DIR/swagger-${version}
if [ ! -f $SWAGGER ]; then
curl -o $SWAGGER -L'#' https://github.com/go-swagger/go-swagger/releases/download/$version/swagger_$(echo `uname`|tr '[:upper:]' '[:lower:]')_amd64
chmod +x $SWAGGER
fi
(cd $DIR/..; $SWAGGER generate server --name=weaviate --spec=openapi-specs/schema.json --default-scheme=https)
(cd $DIR/..; $SWAGGER generate client --spec=openapi-specs/schema.json --default-scheme=https)
# Now add the header to the generated code too.
$DIR/add_header.sh
| version=0.15.0
# Always points to the directory of this script.
DIR="$( cd "$( dirname "${BASH_SOURCE[0]}" )" && pwd )"
SWAGGER=$DIR/swagger-${version}
if [ ! -f $SWAGGER ]; then
curl -o $SWAGGER -L'#' https://github.com/go-swagger/go-swagger/releases/download/$version/swagger_$(echo `uname`|tr '[:upper:]' '[:lower:]')_amd64
chmod +x $SWAGGER
fi
# Remove old stuff.
(cd $DIR/; rm -rf models restapi/operations/)
(cd $DIR/..; $SWAGGER generate server --name=weaviate --spec=openapi-specs/schema.json --default-scheme=https)
(cd $DIR/..; $SWAGGER generate client --spec=openapi-specs/schema.json --default-scheme=https)
# Now add the header to the generated code too.
$DIR/add_header.sh
| Remove all models/operations before generating new ones | gh-396: Remove all models/operations before generating new ones
This is an improvement, because otherwise now non-existing models and
operations will be left behind as cruft.
| Shell | bsd-3-clause | weaviate/weaviate,weaviate/weaviate | shell | ## Code Before:
version=0.15.0
# Always points to the directory of this script.
DIR="$( cd "$( dirname "${BASH_SOURCE[0]}" )" && pwd )"
SWAGGER=$DIR/swagger-${version}
if [ ! -f $SWAGGER ]; then
curl -o $SWAGGER -L'#' https://github.com/go-swagger/go-swagger/releases/download/$version/swagger_$(echo `uname`|tr '[:upper:]' '[:lower:]')_amd64
chmod +x $SWAGGER
fi
(cd $DIR/..; $SWAGGER generate server --name=weaviate --spec=openapi-specs/schema.json --default-scheme=https)
(cd $DIR/..; $SWAGGER generate client --spec=openapi-specs/schema.json --default-scheme=https)
# Now add the header to the generated code too.
$DIR/add_header.sh
## Instruction:
gh-396: Remove all models/operations before generating new ones
This is an improvement, because otherwise now non-existing models and
operations will be left behind as cruft.
## Code After:
version=0.15.0
# Always points to the directory of this script.
DIR="$( cd "$( dirname "${BASH_SOURCE[0]}" )" && pwd )"
SWAGGER=$DIR/swagger-${version}
if [ ! -f $SWAGGER ]; then
curl -o $SWAGGER -L'#' https://github.com/go-swagger/go-swagger/releases/download/$version/swagger_$(echo `uname`|tr '[:upper:]' '[:lower:]')_amd64
chmod +x $SWAGGER
fi
# Remove old stuff.
(cd $DIR/; rm -rf models restapi/operations/)
(cd $DIR/..; $SWAGGER generate server --name=weaviate --spec=openapi-specs/schema.json --default-scheme=https)
(cd $DIR/..; $SWAGGER generate client --spec=openapi-specs/schema.json --default-scheme=https)
# Now add the header to the generated code too.
$DIR/add_header.sh
|
21ec062c9a6c9661b7a3808fbbbaf1681701760a | src/elements/linkedin-icon/index.js | src/elements/linkedin-icon/index.js | //@flow
import React from "react";
import { Link } from "react-router-dom";
import { pure, compose, withState, withHandlers } from "recompose";
import { ThemeProvider } from "styled-components";
import mainTheme from "../../global/style/mainTheme";
import { Logo } from "./Logo";
const enhance = compose(
withState("isActive", "setActive", false),
withHandlers({
addActive: props => () => props.setActive(true),
rmActive: props => () => props.setActive(false)
}),
pure
);
export const LinkedInLogo = enhance(
({
addActive,
rmActive,
isActive
}: {
addActive: () => any,
rmActive: () => any,
isActive: boolean
}) => (
<ThemeProvider theme={mainTheme}>
<Link
to="https://www.linkedin.com/in/oliver-askew-5791a333/"
onMouseEnter={addActive}
onMouseLeave={rmActive}
>
<Logo isActive={isActive} />
</Link>
</ThemeProvider>
)
);
| //@flow
import React from "react";
import { pure, compose, withState, withHandlers } from "recompose";
import { ThemeProvider } from "styled-components";
import mainTheme from "../../global/style/mainTheme";
import { Logo } from "./Logo";
const enhance = compose(
withState("isActive", "setActive", false),
withHandlers({
addActive: props => () => props.setActive(true),
rmActive: props => () => props.setActive(false)
}),
pure
);
export const LinkedInLogo = enhance(
({
addActive,
rmActive,
isActive,
...props
}: {
addActive: () => any,
rmActive: () => any,
isActive: boolean
}) => (
<ThemeProvider theme={mainTheme}>
<a
href="https://www.linkedin.com/in/oliver-askew-5791a333/"
onMouseEnter={addActive}
onMouseLeave={rmActive}
>
<Logo isActive={isActive} />
</a>
</ThemeProvider>
)
);
| Change Link component for a tag in LinkedIn logo | Change Link component for a tag in LinkedIn logo
external links require an a tag to work correctly | JavaScript | mit | slightly-askew/portfolio-2017,slightly-askew/portfolio-2017 | javascript | ## Code Before:
//@flow
import React from "react";
import { Link } from "react-router-dom";
import { pure, compose, withState, withHandlers } from "recompose";
import { ThemeProvider } from "styled-components";
import mainTheme from "../../global/style/mainTheme";
import { Logo } from "./Logo";
const enhance = compose(
withState("isActive", "setActive", false),
withHandlers({
addActive: props => () => props.setActive(true),
rmActive: props => () => props.setActive(false)
}),
pure
);
export const LinkedInLogo = enhance(
({
addActive,
rmActive,
isActive
}: {
addActive: () => any,
rmActive: () => any,
isActive: boolean
}) => (
<ThemeProvider theme={mainTheme}>
<Link
to="https://www.linkedin.com/in/oliver-askew-5791a333/"
onMouseEnter={addActive}
onMouseLeave={rmActive}
>
<Logo isActive={isActive} />
</Link>
</ThemeProvider>
)
);
## Instruction:
Change Link component for a tag in LinkedIn logo
external links require an a tag to work correctly
## Code After:
//@flow
import React from "react";
import { pure, compose, withState, withHandlers } from "recompose";
import { ThemeProvider } from "styled-components";
import mainTheme from "../../global/style/mainTheme";
import { Logo } from "./Logo";
const enhance = compose(
withState("isActive", "setActive", false),
withHandlers({
addActive: props => () => props.setActive(true),
rmActive: props => () => props.setActive(false)
}),
pure
);
export const LinkedInLogo = enhance(
({
addActive,
rmActive,
isActive,
...props
}: {
addActive: () => any,
rmActive: () => any,
isActive: boolean
}) => (
<ThemeProvider theme={mainTheme}>
<a
href="https://www.linkedin.com/in/oliver-askew-5791a333/"
onMouseEnter={addActive}
onMouseLeave={rmActive}
>
<Logo isActive={isActive} />
</a>
</ThemeProvider>
)
);
|
2272649aee64c2f35651dbea5acd2dc3c69a1024 | cfg/layout/html:account:login.php | cfg/layout/html:account:login.php | <?php
declare(strict_types=1);
return [
'toolbar' => [
'active' => false,
],
'title' => [
'parent_id' => 'content',
'sort' => 100,
'cfg' => [
'text' => 'Login',
],
],
'main-content' => [
'type' => 'login',
'parent_id' => 'content',
'sort' => 300,
],
];
| <?php
declare(strict_types=1);
return [
'toolbar' => [
'active' => false,
],
'header' => [
'active' => false,
],
'title' => [
'parent_id' => 'content',
'sort' => 100,
'cfg' => [
'text' => 'Login',
],
],
'main-content' => [
'type' => 'login',
'parent_id' => 'content',
'sort' => 300,
],
];
| Disable header on login page | Disable header on login page
| PHP | mit | akilli/cms,akilli/cms,akilli/cms | php | ## Code Before:
<?php
declare(strict_types=1);
return [
'toolbar' => [
'active' => false,
],
'title' => [
'parent_id' => 'content',
'sort' => 100,
'cfg' => [
'text' => 'Login',
],
],
'main-content' => [
'type' => 'login',
'parent_id' => 'content',
'sort' => 300,
],
];
## Instruction:
Disable header on login page
## Code After:
<?php
declare(strict_types=1);
return [
'toolbar' => [
'active' => false,
],
'header' => [
'active' => false,
],
'title' => [
'parent_id' => 'content',
'sort' => 100,
'cfg' => [
'text' => 'Login',
],
],
'main-content' => [
'type' => 'login',
'parent_id' => 'content',
'sort' => 300,
],
];
|
868f210fe9250102e35b801654380c3a90f481c2 | CONTRIBUTING.md | CONTRIBUTING.md | Contributing
============
We welcome contributions in the form of GitHub issues or pull requests. A [guide to contributing via GitHub][guide] is available.
The documents are licensed under the [Creative Commons CC0 1.0 license][CC0]. Please review CC0 1.0 and agree to license your contributions under the same terms.
The project's [README file][readme] contains a legal disclaimer that speaks on behalf of all contributors. Please review the disclaimer and confirm your agreement with it.
[guide]: http://www.seriesseed.com/posts/2013/02/for-law-nerds-and-real-nerds.html
[CC0]: http://creativecommons.org/publicdomain/zero/1.0/
[readme]: ./README.md
| Contributing
============
We welcome contributions in the form of GitHub issues or pull requests. A [guide to contributing via GitHub][guide] is available.
License and Disclaimer
----------------------
The documents are licensed under the [Creative Commons CC0 1.0 license][CC0]. Please review CC0 1.0 and agree to license your contributions under the same terms.
The project's [README file][readme] contains a legal disclaimer that speaks on behalf of all contributors. Please review the disclaimer and confirm your agreement with it.
Markdown Style Guide
--------------------
Please conform to the overall style of the documents with respect to formatting using [Markdown](https://help.github.com/articles/markdown-basics/).
The title of each document is written as a first-level, setext-style heading:
```markdown
Preferred Stock Investment Agreement
====================================
```
Numbered sections with headings include full stops after both number and summary:
```markdown
### 1.2. Closing; Delivery.
```
Terms should are set in bold where they are defined:
```markdown
... (this "**Agreement**") ...
```
Fill-in-the-blank labels are italicized and set in braces:
```markdown
... _{corporation name}_ ...
```
Instructions to users of the forms are italicized:
```markdown
_Optionally include, "provided, however, ..."_
```
[guide]: http://www.seriesseed.com/posts/2013/02/for-law-nerds-and-real-nerds.html
[CC0]: http://creativecommons.org/publicdomain/zero/1.0/
[readme]: ./README.md
| Add Markdown formatting to contributing guidelines | Add Markdown formatting to contributing guidelines
| Markdown | cc0-1.0 | seriesnext/seriesnext,seriesnext/seriesnext | markdown | ## Code Before:
Contributing
============
We welcome contributions in the form of GitHub issues or pull requests. A [guide to contributing via GitHub][guide] is available.
The documents are licensed under the [Creative Commons CC0 1.0 license][CC0]. Please review CC0 1.0 and agree to license your contributions under the same terms.
The project's [README file][readme] contains a legal disclaimer that speaks on behalf of all contributors. Please review the disclaimer and confirm your agreement with it.
[guide]: http://www.seriesseed.com/posts/2013/02/for-law-nerds-and-real-nerds.html
[CC0]: http://creativecommons.org/publicdomain/zero/1.0/
[readme]: ./README.md
## Instruction:
Add Markdown formatting to contributing guidelines
## Code After:
Contributing
============
We welcome contributions in the form of GitHub issues or pull requests. A [guide to contributing via GitHub][guide] is available.
License and Disclaimer
----------------------
The documents are licensed under the [Creative Commons CC0 1.0 license][CC0]. Please review CC0 1.0 and agree to license your contributions under the same terms.
The project's [README file][readme] contains a legal disclaimer that speaks on behalf of all contributors. Please review the disclaimer and confirm your agreement with it.
Markdown Style Guide
--------------------
Please conform to the overall style of the documents with respect to formatting using [Markdown](https://help.github.com/articles/markdown-basics/).
The title of each document is written as a first-level, setext-style heading:
```markdown
Preferred Stock Investment Agreement
====================================
```
Numbered sections with headings include full stops after both number and summary:
```markdown
### 1.2. Closing; Delivery.
```
Terms should are set in bold where they are defined:
```markdown
... (this "**Agreement**") ...
```
Fill-in-the-blank labels are italicized and set in braces:
```markdown
... _{corporation name}_ ...
```
Instructions to users of the forms are italicized:
```markdown
_Optionally include, "provided, however, ..."_
```
[guide]: http://www.seriesseed.com/posts/2013/02/for-law-nerds-and-real-nerds.html
[CC0]: http://creativecommons.org/publicdomain/zero/1.0/
[readme]: ./README.md
|
b2cdf3495e3a87572695baec68a015fb7e38a714 | core/app/controllers/users/passwords_controller.rb | core/app/controllers/users/passwords_controller.rb | class Users::PasswordsController < Devise::PasswordsController
layout "one_column"
def edit
if params[:msg] # Warning! Also used for logging in unconfirmed users who request new password
@user = User.where(reset_password_token: params[:reset_password_token].to_s).first
if @user
@user.skip_confirmation!
@user.save(validate: false)
sign_in @user, bypass: true
redirect_to after_sign_in_path_for(@user).to_s
else
redirect_to new_user_session_path,
notice: 'Your account is already set up. Please log in to continue.'
end
else
super
end
end
end
| class Users::PasswordsController < Devise::PasswordsController
layout "one_column"
end
| Remove old approval flow, just have people use the reset password functionality | Remove old approval flow, just have people use the reset password functionality
| Ruby | mit | daukantas/factlink-core,daukantas/factlink-core,Factlink/factlink-core,Factlink/factlink-core,Factlink/factlink-core,daukantas/factlink-core,Factlink/factlink-core,daukantas/factlink-core | ruby | ## Code Before:
class Users::PasswordsController < Devise::PasswordsController
layout "one_column"
def edit
if params[:msg] # Warning! Also used for logging in unconfirmed users who request new password
@user = User.where(reset_password_token: params[:reset_password_token].to_s).first
if @user
@user.skip_confirmation!
@user.save(validate: false)
sign_in @user, bypass: true
redirect_to after_sign_in_path_for(@user).to_s
else
redirect_to new_user_session_path,
notice: 'Your account is already set up. Please log in to continue.'
end
else
super
end
end
end
## Instruction:
Remove old approval flow, just have people use the reset password functionality
## Code After:
class Users::PasswordsController < Devise::PasswordsController
layout "one_column"
end
|
c7608162a4dfc500c04475cf45583f0e358d0037 | packages/webdriverio/src/index.ts | packages/webdriverio/src/index.ts | import { Browser } from 'mugshot';
/**
* API adapter for WebdriverIO to make working with it saner.
*/
export default class WebdriverIOAdapter implements Browser {
private browser: WebDriver.ClientAsync & WebdriverIOAsync.Browser;
constructor(browser: WebDriver.ClientAsync & WebdriverIOAsync.Browser) {
this.browser = browser;
}
takeScreenshot = async () => this.browser.takeScreenshot();
getElementRect = async (selector: string) => {
// @ts-ignore because the return type is not properly inferred
const rect: DOMRect = await this.browser.execute(
WebdriverIOAdapter.getBoundingRect,
selector
);
return {
x: rect.x,
y: rect.y,
width: rect.width,
height: rect.height
};
};
private static getBoundingRect(selector: string): DOMRect {
// @ts-ignore because querySelector can be null and we don't
// care about browsers that don't support it.
return document.querySelector(selector).getBoundingClientRect();
}
}
| import { Browser } from 'mugshot';
/* istanbul ignore next because this will get stringified and sent to the browser */
function getBoundingRect(selector: string): DOMRect {
// @ts-ignore because querySelector can be null and we don't
// care about browsers that don't support it.
return document.querySelector(selector).getBoundingClientRect();
}
/**
* API adapter for WebdriverIO to make working with it saner.
*/
export default class WebdriverIOAdapter implements Browser {
private browser: WebDriver.ClientAsync & WebdriverIOAsync.Browser;
constructor(browser: WebDriver.ClientAsync & WebdriverIOAsync.Browser) {
this.browser = browser;
}
takeScreenshot = async () => this.browser.takeScreenshot();
getElementRect = async (selector: string) => {
// @ts-ignore because the return type is not properly inferred
const rect: DOMRect = await this.browser.execute(
getBoundingRect,
selector
);
return {
x: rect.x,
y: rect.y,
width: rect.width,
height: rect.height
};
};
}
| Fix stringified function breaking under istanbul | Fix stringified function breaking under istanbul
| TypeScript | mit | uberVU/mugshot,uberVU/mugshot | typescript | ## Code Before:
import { Browser } from 'mugshot';
/**
* API adapter for WebdriverIO to make working with it saner.
*/
export default class WebdriverIOAdapter implements Browser {
private browser: WebDriver.ClientAsync & WebdriverIOAsync.Browser;
constructor(browser: WebDriver.ClientAsync & WebdriverIOAsync.Browser) {
this.browser = browser;
}
takeScreenshot = async () => this.browser.takeScreenshot();
getElementRect = async (selector: string) => {
// @ts-ignore because the return type is not properly inferred
const rect: DOMRect = await this.browser.execute(
WebdriverIOAdapter.getBoundingRect,
selector
);
return {
x: rect.x,
y: rect.y,
width: rect.width,
height: rect.height
};
};
private static getBoundingRect(selector: string): DOMRect {
// @ts-ignore because querySelector can be null and we don't
// care about browsers that don't support it.
return document.querySelector(selector).getBoundingClientRect();
}
}
## Instruction:
Fix stringified function breaking under istanbul
## Code After:
import { Browser } from 'mugshot';
/* istanbul ignore next because this will get stringified and sent to the browser */
function getBoundingRect(selector: string): DOMRect {
// @ts-ignore because querySelector can be null and we don't
// care about browsers that don't support it.
return document.querySelector(selector).getBoundingClientRect();
}
/**
* API adapter for WebdriverIO to make working with it saner.
*/
export default class WebdriverIOAdapter implements Browser {
private browser: WebDriver.ClientAsync & WebdriverIOAsync.Browser;
constructor(browser: WebDriver.ClientAsync & WebdriverIOAsync.Browser) {
this.browser = browser;
}
takeScreenshot = async () => this.browser.takeScreenshot();
getElementRect = async (selector: string) => {
// @ts-ignore because the return type is not properly inferred
const rect: DOMRect = await this.browser.execute(
getBoundingRect,
selector
);
return {
x: rect.x,
y: rect.y,
width: rect.width,
height: rect.height
};
};
}
|
42a2254ad30a0edd3ae3f82deef3cae717bedfcc | .travis.yml | .travis.yml | language: php
php:
- '7.1'
- '7.2'
- '7.3'
- '7.4snapshot'
- nightly
matrix:
allow_failures:
- php: nightly
- php: '7.4snapshot'
install:
- composer update
script:
- vendor/bin/phpunit --coverage-clover build/coverage/xml
after_script:
- php vendor/bin/codacycoverage clover build/coverage/xml | language: php
php:
- '7.1'
- '7.2'
- '7.3'
- '7.4snapshot'
- nightly
matrix:
allow_failures:
- php: nightly
- php: '7.4snapshot'
install:
- composer update
script:
- if [[ $(phpenv version-name) != '7.1' && $(phpenv version-name) != '7.2' ]] ; then vendor/bin/phpunit --coverage-clover build/coverage/xml ; fi
after_script:
- if [[ $(phpenv version-name) != '7.1' && $(phpenv version-name) != '7.2' ]] ; then vendor/bin/codacycoverage clover build/coverage/xml ; fi | Fix Travis, Xdebug old version coverage test. Coverage now only run on PHP >=7.3. | Fix Travis, Xdebug old version coverage test. Coverage now only run on PHP >=7.3.
| YAML | mit | julien-boudry/Condorcet | yaml | ## Code Before:
language: php
php:
- '7.1'
- '7.2'
- '7.3'
- '7.4snapshot'
- nightly
matrix:
allow_failures:
- php: nightly
- php: '7.4snapshot'
install:
- composer update
script:
- vendor/bin/phpunit --coverage-clover build/coverage/xml
after_script:
- php vendor/bin/codacycoverage clover build/coverage/xml
## Instruction:
Fix Travis, Xdebug old version coverage test. Coverage now only run on PHP >=7.3.
## Code After:
language: php
php:
- '7.1'
- '7.2'
- '7.3'
- '7.4snapshot'
- nightly
matrix:
allow_failures:
- php: nightly
- php: '7.4snapshot'
install:
- composer update
script:
- if [[ $(phpenv version-name) != '7.1' && $(phpenv version-name) != '7.2' ]] ; then vendor/bin/phpunit --coverage-clover build/coverage/xml ; fi
after_script:
- if [[ $(phpenv version-name) != '7.1' && $(phpenv version-name) != '7.2' ]] ; then vendor/bin/codacycoverage clover build/coverage/xml ; fi |
59373060d8d65384f6b78e5acf18a3ebff543045 | _setup/utils/resolve-static-triggers.js | _setup/utils/resolve-static-triggers.js | 'use strict';
var resolve = require('esniff/accessed-properties')('this')
, memoize = require('memoizee/plain')
, ignored = require('./meta-property-names')
, re = new RegExp('^\\s*function\\s*(?:[\\0-\'\\)-\\uffff]+)*\\s*\\(\\s*' +
'(_observe)?[\\/*\\s]*\\)\\s*\\{([\\0-\\uffff]*)\\}\\s*$')
, isFn = RegExp.prototype.test.bind(/^\s*\(/);
module.exports = memoize(function (fn) {
var body = String(fn).match(re)[2], shift = 0;
resolve(body).forEach(function (data) {
var name = data.name, start;
if (name[0] === '_') return;
if (ignored[name]) return;
if (isFn(body.slice(data.end + shift))) return;
start = data.start - 5 + shift;
body = body.slice(0, start) + '_observe(this._get(\'' + name + '\'))' +
body.slice(data.end + shift);
shift += 18;
});
if (!shift) return fn;
return new Function('_observe', body);
});
| 'use strict';
var resolve = require('esniff/accessed-properties')('this')
, memoize = require('memoizee/plain')
, ignored = require('./meta-property-names')
, re = new RegExp('^\\s*function\\s*(?:[\\0-\'\\)-\\uffff]+)*\\s*\\(\\s*' +
'(_observe)?[\\/*\\s]*\\)\\s*\\{([\\0-\\uffff]*)\\}\\s*$')
, isFn = RegExp.prototype.test.bind(/^\s*\(/);
module.exports = memoize(function (fn) {
var body = String(fn).match(re)[2], shift = 0;
resolve(body).forEach(function (data) {
var name = data.name, start;
if (name[0] === '_') return;
if (ignored[name]) return;
if (isFn(body.slice(data.end + shift))) return;
start = data.start - 5 + shift;
body = body.slice(0, start) + '_observe(this._get(\'' + name + '\'))' +
body.slice(data.end + shift);
shift += 18;
});
if (!shift) return fn;
body = 'try {\n' + body +
'\n;} catch (e) { throw new Error("Dbjs getter error:\\n\\n" + e.stack + ' +
'"\\n\\nGetter Body:\\n' + JSON.stringify(body).slice(1) + '); }';
return new Function('_observe', body);
});
| Introduce better error reporting for getter errors | Introduce better error reporting for getter errors
| JavaScript | mit | medikoo/dbjs | javascript | ## Code Before:
'use strict';
var resolve = require('esniff/accessed-properties')('this')
, memoize = require('memoizee/plain')
, ignored = require('./meta-property-names')
, re = new RegExp('^\\s*function\\s*(?:[\\0-\'\\)-\\uffff]+)*\\s*\\(\\s*' +
'(_observe)?[\\/*\\s]*\\)\\s*\\{([\\0-\\uffff]*)\\}\\s*$')
, isFn = RegExp.prototype.test.bind(/^\s*\(/);
module.exports = memoize(function (fn) {
var body = String(fn).match(re)[2], shift = 0;
resolve(body).forEach(function (data) {
var name = data.name, start;
if (name[0] === '_') return;
if (ignored[name]) return;
if (isFn(body.slice(data.end + shift))) return;
start = data.start - 5 + shift;
body = body.slice(0, start) + '_observe(this._get(\'' + name + '\'))' +
body.slice(data.end + shift);
shift += 18;
});
if (!shift) return fn;
return new Function('_observe', body);
});
## Instruction:
Introduce better error reporting for getter errors
## Code After:
'use strict';
var resolve = require('esniff/accessed-properties')('this')
, memoize = require('memoizee/plain')
, ignored = require('./meta-property-names')
, re = new RegExp('^\\s*function\\s*(?:[\\0-\'\\)-\\uffff]+)*\\s*\\(\\s*' +
'(_observe)?[\\/*\\s]*\\)\\s*\\{([\\0-\\uffff]*)\\}\\s*$')
, isFn = RegExp.prototype.test.bind(/^\s*\(/);
module.exports = memoize(function (fn) {
var body = String(fn).match(re)[2], shift = 0;
resolve(body).forEach(function (data) {
var name = data.name, start;
if (name[0] === '_') return;
if (ignored[name]) return;
if (isFn(body.slice(data.end + shift))) return;
start = data.start - 5 + shift;
body = body.slice(0, start) + '_observe(this._get(\'' + name + '\'))' +
body.slice(data.end + shift);
shift += 18;
});
if (!shift) return fn;
body = 'try {\n' + body +
'\n;} catch (e) { throw new Error("Dbjs getter error:\\n\\n" + e.stack + ' +
'"\\n\\nGetter Body:\\n' + JSON.stringify(body).slice(1) + '); }';
return new Function('_observe', body);
});
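To make the rewrite concrete, a sketch with a hypothetical getter (not actual module output):
```js
// Hypothetical getter handed to the memoized function:
//   function () { return this.firstName + ' ' + this.lastName; }
// Each `this.<name>` read becomes `_observe(this._get('<name>'))`, and with this change
// the generated body is additionally wrapped, roughly:
//   try {
//     return _observe(this._get('firstName')) + ' ' + _observe(this._get('lastName'));
//   } catch (e) { throw new Error("Dbjs getter error:\n\n" + e.stack + "\n\nGetter Body:\n..."); }
// before being compiled with `new Function('_observe', body)`.
```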
|
97e69624bf6f2e5ff7e5bbe199136588a613d282 | src/Consul/ConsulDns.php | src/Consul/ConsulDns.php | <?php
namespace CascadeEnergy\ServiceDiscovery\Consul;
use CascadeEnergy\ServiceDiscovery\ServiceDiscoveryClientInterface;
class ConsulDns implements ServiceDiscoveryClientInterface
{
private $lookupService;
public function __construct(callable $lookupService = null)
{
$this->lookupService = $lookupService;
if (is_null($lookupService)) {
$this->lookupService = "dns_get_record";
}
}
public function getServiceAddress($serviceName, $version = null)
{
$dnsEntry = "$serviceName.service.consul";
if (!empty($version)) {
$dnsEntry = "$version.$dnsEntry";
}
$resultList = call_user_func($this->lookupService, $dnsEntry, DNS_SRV);
if (empty($resultList)) {
return false;
}
$ipAddress = '';
$port = '80';
foreach ($resultList as $result) {
if (isset($result['host'])) {
$ipAddress = $result['host'];
}
if (isset($result['port'])) {
$port = $result['port'];
}
}
return "$ipAddress:$port";
}
}
| <?php
namespace CascadeEnergy\ServiceDiscovery\Consul;
use CascadeEnergy\ServiceDiscovery\ServiceDiscoveryClientInterface;
class ConsulDns implements ServiceDiscoveryClientInterface
{
private $lookupService;
public function __construct(callable $lookupService = null)
{
$this->lookupService = $lookupService;
if (is_null($lookupService)) {
$this->lookupService = "dns_get_record";
}
}
public function getServiceAddress($serviceName, $version = null)
{
$dnsEntry = "$serviceName.service.consul";
if (!empty($version)) {
$dnsEntry = "$version.$dnsEntry";
}
$resultList = call_user_func($this->lookupService, $dnsEntry, DNS_SRV);
if (empty($resultList)) {
return false;
}
$ipAddress = '';
$port = '80';
foreach ($resultList as $result) {
if (isset($result['target'])) {
$ipAddress = $result['target'];
}
if (isset($result['port'])) {
$port = $result['port'];
}
}
return "$ipAddress:$port";
}
}
| Fix ip -> target bug | Fix ip -> target bug
| PHP | mit | CascadeEnergy/php-consul-client | php | ## Code Before:
<?php
namespace CascadeEnergy\ServiceDiscovery\Consul;
use CascadeEnergy\ServiceDiscovery\ServiceDiscoveryClientInterface;
class ConsulDns implements ServiceDiscoveryClientInterface
{
private $lookupService;
public function __construct(callable $lookupService = null)
{
$this->lookupService = $lookupService;
if (is_null($lookupService)) {
$this->lookupService = "dns_get_record";
}
}
public function getServiceAddress($serviceName, $version = null)
{
$dnsEntry = "$serviceName.service.consul";
if (!empty($version)) {
$dnsEntry = "$version.$dnsEntry";
}
$resultList = call_user_func($this->lookupService, $dnsEntry, DNS_SRV);
if (empty($resultList)) {
return false;
}
$ipAddress = '';
$port = '80';
foreach ($resultList as $result) {
if (isset($result['host'])) {
$ipAddress = $result['host'];
}
if (isset($result['port'])) {
$port = $result['port'];
}
}
return "$ipAddress:$port";
}
}
## Instruction:
Fix ip -> target bug
## Code After:
<?php
namespace CascadeEnergy\ServiceDiscovery\Consul;
use CascadeEnergy\ServiceDiscovery\ServiceDiscoveryClientInterface;
class ConsulDns implements ServiceDiscoveryClientInterface
{
private $lookupService;
public function __construct(callable $lookupService = null)
{
$this->lookupService = $lookupService;
if (is_null($lookupService)) {
$this->lookupService = "dns_get_record";
}
}
public function getServiceAddress($serviceName, $version = null)
{
$dnsEntry = "$serviceName.service.consul";
if (!empty($version)) {
$dnsEntry = "$version.$dnsEntry";
}
$resultList = call_user_func($this->lookupService, $dnsEntry, DNS_SRV);
if (empty($resultList)) {
return false;
}
$ipAddress = '';
$port = '80';
foreach ($resultList as $result) {
if (isset($result['target'])) {
$ipAddress = $result['target'];
}
if (isset($result['port'])) {
$port = $result['port'];
}
}
return "$ipAddress:$port";
}
}
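For reference, the fix relies on the array shape PHP returns for SRV lookups; a sketch with hypothetical values (the resolved hostname lives under 'target', while 'host' is only the queried name):
```php
<?php
// Illustrative only: approximate shape of one dns_get_record($name, DNS_SRV) entry.
$exampleSrvRecord = [
    'host'   => 'myservice.service.consul', // the name that was queried
    'class'  => 'IN',
    'ttl'    => 0,
    'type'   => 'SRV',
    'pri'    => 1,
    'weight' => 1,
    'port'   => 8300,
    'target' => 'node-1.node.dc1.consul',   // what getServiceAddress() should report
];
```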
|
67d60965fe518c4ffc791e9561be44569aa94f52 | project.properties | project.properties | target=android-21
android.library.reference.1=..\\..\\workspace\\cardview
android.library.reference.2=../../workspace/android-support-v7-appcompat
| proguard.config=${sdk.dir}/tools/proguard/proguard-android.txt:proguard-project.txt
# Project target.
target=android-21
android.library.reference.1=..\\..\\workspace\\cardview
android.library.reference.2=../../workspace/android-support-v7-appcompat
| Enable ProGuard to strip out unnecessary code from the support library | Enable ProGuard to strip out unnecessary code from the support library | INI | apache-2.0 | blunden/haveibeenpwned | ini | ## Code Before:
target=android-21
android.library.reference.1=..\\..\\workspace\\cardview
android.library.reference.2=../../workspace/android-support-v7-appcompat
## Instruction:
Enable ProGuard to strip out unnecessary code from the support library
## Code After:
proguard.config=${sdk.dir}/tools/proguard/proguard-android.txt:proguard-project.txt
# Project target.
target=android-21
android.library.reference.1=..\\..\\workspace\\cardview
android.library.reference.2=../../workspace/android-support-v7-appcompat
|
ec69bffd1e732a55a3211cfd65f3bef6aed33a39 | src/validate.rs | src/validate.rs | /// Validation provider
pub trait Validator {
fn is_valid(&self, line: &str) -> bool;
}
impl Validator for () {
fn is_valid(&self, _line: &str) -> bool {
true
}
}
| //! Input buffer validation API (Multi-line editing)
/// This trait provides an extension interface for determining whether
/// the current input buffer is valid. Rustyline uses the method
/// provided by this trait to decide whether hitting the enter key
/// will end the current editing session and return the current line
/// buffer to the caller of `Editor::readline` or variants.
pub trait Validator {
/// Takes the currently edited `line` and returns a bool
/// indicating whether it is valid or not. The most common
/// validity check to implement is probably whether the input is
/// complete or not, for instance ensuring that all delimiters are
/// fully balanced.
///
/// If you implement more complex validation checks it's probably
/// a good idea to also implement a `Hinter` to provide feedback
/// about what is invalid.
#[allow(unused_variables)]
fn is_valid(&self, line: &str) -> bool {
true
}
}
impl Validator for () {}
| Improve documentation for new Validation trait | Improve documentation for new Validation trait
| Rust | mit | kkawakam/rustyline | rust | ## Code Before:
/// Validation provider
pub trait Validator {
fn is_valid(&self, line: &str) -> bool;
}
impl Validator for () {
fn is_valid(&self, _line: &str) -> bool {
true
}
}
## Instruction:
Improve documentation for new Validation trait
## Code After:
//! Input buffer validation API (Multi-line editing)
/// This trait provides an extension interface for determining whether
/// the current input buffer is valid. Rustyline uses the method
/// provided by this trait to decide whether hitting the enter key
/// will end the current editing session and return the current line
/// buffer to the caller of `Editor::readline` or variants.
pub trait Validator {
/// Takes the currently edited `line` and returns a bool
/// indicating whether it is valid or not. The most common
/// validity check to implement is probably whether the input is
/// complete or not, for instance ensuring that all delimiters are
/// fully balanced.
///
/// If you implement more complex validation checks it's probably
/// a good idea to also implement a `Hinter` to provide feedback
/// about what is invalid.
#[allow(unused_variables)]
fn is_valid(&self, line: &str) -> bool {
true
}
}
impl Validator for () {}
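A minimal sketch of a custom implementation against the trait above; the bracket-balancing rule is only an illustration:
```rust
/// Sketch: treat a line as valid once its parentheses are balanced.
struct BracketValidator;

impl Validator for BracketValidator {
    fn is_valid(&self, line: &str) -> bool {
        let mut depth: i64 = 0;
        for c in line.chars() {
            match c {
                '(' => depth += 1,
                ')' => depth -= 1,
                _ => {}
            }
        }
        depth == 0
    }
}
```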
|
80ccc5ef130979ddfbf4dfa142356c4f48e2ae0d | README.md | README.md |
Pliny is a template Sinatra app to implement postgres-backed APIs.
|
Pliny is a template Sinatra app to implement postgres-backed APIs.
```
bundle install
createdb pliny-development
foreman start web
```
| Add setup instructions for Pliny | Add setup instructions for Pliny
| Markdown | mit | uhoh-itsmaciek/transferatu,uhoh-itsmaciek/transferatu,neilmiddleton/pliny-test,heroku/pgperf,heroku/pgperf,ukd1/transferatu,ukd1/transferatu | markdown | ## Code Before:
Pliny is a template Sinatra app to implement postgres-backed APIs.
## Instruction:
Add setup instructions for Pliny
## Code After:
Pliny is a template Sinatra app to implement postgres-backed APIs.
```
bundle install
createdb pliny-development
foreman start web
```
|
7ad3de734a21e8a42d7f6993da684dff5bcfb67c | tox-requirements.txt | tox-requirements.txt | tox < 3.0.0; python_version == '3.3'
tox; python_version != '3.3'
virtualenv == 15.2.0; python_version == '3.3'
| tox < 3.0.0; python_version == '3.3'
tox < 4.0.0; implementation_name == 'pypy'
tox; python_version != '3.3' and implementation_name != 'pypy'
virtualenv == 15.2.0; python_version == '3.3'
| Use older tox for the pypy python | Use older tox for the pypy python
| Text | bsd-3-clause | sjagoe/haas,sjagoe/haas,itziakos/haas,scalative/haas,scalative/haas,itziakos/haas | text | ## Code Before:
tox < 3.0.0; python_version == '3.3'
tox; python_version != '3.3'
virtualenv == 15.2.0; python_version == '3.3'
## Instruction:
Use older tox for the pypy python
## Code After:
tox < 3.0.0; python_version == '3.3'
tox < 4.0.0; implementation_name == 'pypy'
tox; python_version != '3.3' and implementation_name != 'pypy'
virtualenv == 15.2.0; python_version == '3.3'
|
c8978063a6e6fab40a58774610140e22e1621b98 | .travis.yml | .travis.yml | language: rust
rust:
- stable
- beta
- nightly
sudo: false
addons:
apt:
packages:
- libgl1-mesa-dev
os:
- linux
- osx
| language: rust
rust:
- stable
- beta
- nightly
sudo: false
addons:
apt:
packages:
- libgl1-mesa-dev
os:
- linux
- osx
matrix:
allow_failures:
- rust: nightly
- rust: beta
| Allow nightly and beta to fail | Allow nightly and beta to fail
| YAML | apache-2.0 | gltf-rs/gltf,warmwaffles/gltf | yaml | ## Code Before:
language: rust
rust:
- stable
- beta
- nightly
sudo: false
addons:
apt:
packages:
- libgl1-mesa-dev
os:
- linux
- osx
## Instruction:
Allow nightly and beta to fail
## Code After:
language: rust
rust:
- stable
- beta
- nightly
sudo: false
addons:
apt:
packages:
- libgl1-mesa-dev
os:
- linux
- osx
matrix:
allow_failures:
- rust: nightly
- rust: beta
|
d0f0ec5be3d318f5abfcd1c72846cbb849ecf8fd | .config/fish/functions/fs.fish | .config/fish/functions/fs.fish | function fs --argument-names function_name
set -l dest_file ~/.config/fish/functions/$function_name.fish
set -l original_contents
set -l comment_lines
if test -e $dest_file
set original_contents (cat $dest_file)
for line in $original_contents
# Collect lines starting with a hash
if test '#' = (string sub --length 1 $line)
set comment_lines $comment_lines $line
end
end
# echo comments:\n$comment_lines
end
funcsave $function_name
if test -n $comment_lines
# Read in the new function definition
set -l new_contents (cat $dest_file)
# Wipe out the file with the original comments
echo -n $comment_lines\n >$dest_file
# Filter the new content to fix indentation
for line in $new_contents
set -l first_char (string sub --length 1 $line)
set -l all_after_first_char (string sub --start 1 $line)
if test \t = $first_char
# Skip the tab on the first line of the function body
# Somehow the \t takes up 2 chars?
echo " "(string sub --start 2 $line) >>$dest_file
else
echo $all_after_first_char >>$dest_file
end
end
# echo "Comments restored"
end
cat $dest_file
end
| function fs --argument function_name
set -l dest_file ~/.config/fish/functions/$function_name.fish
set -l original_contents
set -l comment_lines
if test -e $dest_file
set original_contents (cat $dest_file)
for line in $original_contents
# Collect lines starting with a hash
if test '#' = (string sub --length 1 $line)
set comment_lines $comment_lines $line
end
end
end
funcsave $function_name
if test -n $comment_lines
# Read in the new function definition
set -l new_contents (cat $dest_file)
# Wipe out the file with the original comments
echo -n $comment_lines\n >$dest_file
echo -n $new_contents\n >>$dest_file
end
# Fix the funky indentation (tab on first line of body, extra space on all)
fish_indent --write $dest_file
cat $dest_file
end
| Use fish_indent to fix funcsave output | Use fish_indent to fix funcsave output
| fish | mit | phatblat/dotfiles,phatblat/dotfiles,phatblat/dotfiles,phatblat/dotfiles,phatblat/dotfiles,phatblat/dotfiles | fish | ## Code Before:
function fs --argument-names function_name
set -l dest_file ~/.config/fish/functions/$function_name.fish
set -l original_contents
set -l comment_lines
if test -e $dest_file
set original_contents (cat $dest_file)
for line in $original_contents
# Collect lines starting with a hash
if test '#' = (string sub --length 1 $line)
set comment_lines $comment_lines $line
end
end
# echo comments:\n$comment_lines
end
funcsave $function_name
if test -n $comment_lines
# Read in the new function definition
set -l new_contents (cat $dest_file)
# Wipe out the file with the original comments
echo -n $comment_lines\n >$dest_file
# Filter the new content to fix indentation
for line in $new_contents
set -l first_char (string sub --length 1 $line)
set -l all_after_first_char (string sub --start 1 $line)
if test \t = $first_char
# Skip the tab on the first line of the function body
# Somehow the \t takes up 2 chars?
echo " "(string sub --start 2 $line) >>$dest_file
else
echo $all_after_first_char >>$dest_file
end
end
# echo "Comments restored"
end
cat $dest_file
end
## Instruction:
Use fish_indent to fix funcsave output
## Code After:
function fs --argument function_name
set -l dest_file ~/.config/fish/functions/$function_name.fish
set -l original_contents
set -l comment_lines
if test -e $dest_file
set original_contents (cat $dest_file)
for line in $original_contents
# Collect lines starting with a hash
if test '#' = (string sub --length 1 $line)
set comment_lines $comment_lines $line
end
end
end
funcsave $function_name
if test -n $comment_lines
# Read in the new function definition
set -l new_contents (cat $dest_file)
# Wipe out the file with the original comments
echo -n $comment_lines\n >$dest_file
echo -n $new_contents\n >>$dest_file
end
# Fix the funky indentation (tab on first line of body, extra space on all)
fish_indent --write $dest_file
cat $dest_file
end
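Typical use, sketched with a made-up function name: define or tweak a function in the current session, then persist it; leading # comments already in the saved file survive and fish_indent normalises the layout:
```fish
# Sketch only: `greet` is a hypothetical example function.
function greet
    echo "hello, $argv"
end
fs greet   # writes ~/.config/fish/functions/greet.fish, keeping existing # comments
```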
|
7cda51f02ad3a423974ec837c28480d75ef469f2 | features/chart_of_accounts/organization_review.feature | features/chart_of_accounts/organization_review.feature | Feature: Organization Review
[KFSQA-583] As a KFS Chart Administrator I want to copy an Organization without getting an error.
[KFSQA-584] As a KFS Chart Manager, the Organization Review document should route to final.
@KFSQA-583 @Bug @OrgMaint @KFSMI-9622 @hare @solid
Scenario: Select an Organization Review to get to the Organization Review screen
Given I am logged in as a KFS Chart Manager
And I edit an Organization Review
Then the Organization Review document goes to SAVED
@KFSQA-584 @Bug @Routing @OrgReview @KFSMI-10435 @sloth @solid
Scenario: Create an Organization Review, Blanket Approve it, have it go to Final.
Given I am logged in as a KFS Chart Manager
And I save an Organization Review document
When I blanket approve the Organization Review document
Then the Organization Review document goes to FINAL
| Feature: Organization Review
[KFSQA-583] As a KFS Chart Manager, I want to edit, copy, or create delegation for an Organization Review
without getting an Incident Report.
[KFSQA-584] As a KFS Chart Manager, the Organization Review document should route to final.
@KFSQA-583 @Bug @OrgMaint @KFSMI-9622 @hare @solid
Scenario: When the KFS Chart Manager edits and submits an Organization Review, the document routes to Final.
Given I am logged in as a KFS Chart Manager
And I edit the Active To Date on a random Organization Review to today's date
When I submit the Organization Review document
And the document should have no errors
Then the Organization Review document goes to FINAL
@KFSQA-583 @Bug @OrgMaint @KFSMI-9622 @hare @solid
Scenario: When the KFS Chart Manager copies and submits an Organization Review, the document routes to Final.
Given I am logged in as a KFS Chart Manager
And I copy a random Organization Review changing Organization Code to a random value
When I submit the Organization Review document
And the document should have no errors
Then the Organization Review document goes to FINAL
@KFSQA-583 @Bug @OrgMaint @KFSMI-9622 @hare @solid
Scenario: When the KFS Chart Manager performs a create delegation for an Organization Review and submits
the document, the document routes to Final.
Given I am logged in as a KFS Chart Manager
And I create a primary delegation for a random Organization Review
And I submit the Organization Review document changing the Principal Name to a random KFS User
And the document should have no errors
Then the Organization Review document goes to FINAL
@KFSQA-584 @Bug @Routing @OrgReview @KFSMI-10435 @sloth @solid
Scenario: Create an Organization Review, Blanket Approve it, have it go to Final.
Given I am logged in as a KFS Chart Manager
And I save an Organization Review document
When I blanket approve the Organization Review document
Then the Organization Review document goes to FINAL
| Fix KFSQA-583 and KFSQA-584 false positive tests. Removed caching for global_config methods that looked up the first or a random principal name and continued to use that same value throughout the test even when a different value would be needed. Added fail for the AFT with a more descriptive error message when method KFSDataObject.view does not find an edoc when doc search by document id is performed. This more descriptive error message was needed to address the confusing Watir error message that is presently being shown. | [KFSQA-1147] Fix KFSQA-583 and KFSQA-584 false positive tests. Removed caching for global_config methods that looked up the first or a random principal name and continued to use that same value throughout the test even when a different value would be needed. Added fail for the AFT with a more descriptive error message when method KFSDataObject.view does not find an edoc when doc search by document id is performed. This more descriptive error message was needed to address the confusing Watir error message that is presently being shown.
| Cucumber | apache-2.0 | CU-CommunityApps/kuality-kfs-cu | cucumber | ## Code Before:
Feature: Organization Review
[KFSQA-583] As a KFS Chart Administrator I want to copy an Organization without getting an error.
[KFSQA-584] As a KFS Chart Manager, the Organization Review document should route to final.
@KFSQA-583 @Bug @OrgMaint @KFSMI-9622 @hare @solid
Scenario: Select an Organization Review to get to the Organization Review screen
Given I am logged in as a KFS Chart Manager
And I edit an Organization Review
Then the Organization Review document goes to SAVED
@KFSQA-584 @Bug @Routing @OrgReview @KFSMI-10435 @sloth @solid
Scenario: Create an Organization Review, Blanket Approve it, have it go to Final.
Given I am logged in as a KFS Chart Manager
And I save an Organization Review document
When I blanket approve the Organization Review document
Then the Organization Review document goes to FINAL
## Instruction:
[KFSQA-1147] Fix KFSQA-583 and KFSQA-584 false positive tests. Removed caching for global_config methods that looked up the first or a random principal name and continued to use that same value throughout the test even when a different value would be needed. Added fail for the AFT with a more descriptive error message when method KFSDataObject.view does not find an edoc when doc search by document id is performed. This more descriptive error message was needed to address the confusing Watir error message that is presently being shown.
## Code After:
Feature: Organization Review
[KFSQA-583] As a KFS Chart Manager, I want to edit, copy, or create delegation for an Organization Review
without getting an Incident Report.
[KFSQA-584] As a KFS Chart Manager, the Organization Review document should route to final.
@KFSQA-583 @Bug @OrgMaint @KFSMI-9622 @hare @solid
Scenario: When the KFS Chart Manager edits and submits an Organization Review, the document routes to Final.
Given I am logged in as a KFS Chart Manager
And I edit the Active To Date on a random Organization Review to today's date
When I submit the Organization Review document
And the document should have no errors
Then the Organization Review document goes to FINAL
@KFSQA-583 @Bug @OrgMaint @KFSMI-9622 @hare @solid
Scenario: When the KFS Chart Manager copies and submits an Organization Review, the document routes to Final.
Given I am logged in as a KFS Chart Manager
And I copy a random Organization Review changing Organization Code to a random value
When I submit the Organization Review document
And the document should have no errors
Then the Organization Review document goes to FINAL
@KFSQA-583 @Bug @OrgMaint @KFSMI-9622 @hare @solid
Scenario: When the KFS Chart Manager performs a create delegation for an Organization Review and submits
the document, the document routes to Final.
Given I am logged in as a KFS Chart Manager
And I create a primary delegation for a random Organization Review
And I submit the Organization Review document changing the Principal Name to a random KFS User
And the document should have no errors
Then the Organization Review document goes to FINAL
@KFSQA-584 @Bug @Routing @OrgReview @KFSMI-10435 @sloth @solid
Scenario: Create an Organization Review, Blanket Approve it, have it go to Final.
Given I am logged in as a KFS Chart Manager
And I save an Organization Review document
When I blanket approve the Organization Review document
Then the Organization Review document goes to FINAL
|
55f4dac3d3569fb4b8c35f5e63ead24b97316921 | packages/tux/src/utils/accessors.ts | packages/tux/src/utils/accessors.ts | /**
* Gets the value of a property on an object.
*
 * @param {Object} obj The object to read the property from
 * @param {String} key The property path, as a dot-separated string or an array of segments
* @returns {Object} the property value or 'null'.
*/
export function get(obj: any, key: string | string[]): any {
if (key.length === 0 || !obj) {
return obj ? obj : null
}
const parts = _splitKey(key)
const nextLevel = obj[parts[0]]
const restOfKey = parts.slice(1, parts.length).join('.')
return get(nextLevel, restOfKey)
}
export function set(obj: any, key: string | string[], value: any): void {
const parts = _splitKey(key)
if (parts.length === 1) {
obj[parts[0]] = value
} else {
const lastKeyPartIndex = parts.length - 1
const parent = get(obj, parts.slice(0, lastKeyPartIndex))
const lastKeyPart = parts[lastKeyPartIndex]
parent[lastKeyPart] = value
}
}
function _splitKey(key: string | string[]): string[] {
if (key instanceof Array) {
return key
}
return key.split('.')
}
| /**
* Gets the value of a property on an object.
*
 * @param {Object} obj The object to read the property from
 * @param {String} key The property path, as a dot-separated string or an array of segments
* @returns {Object} the property value or 'null'.
*/
export function get(obj: any, key: string | string[]): any {
if (key.length === 0 || !obj) {
return obj ? obj : null
}
const parts = _splitKey(key)
const nextLevel = obj[parts[0]]
const restOfKey = parts.slice(1, parts.length).join('.')
return get(nextLevel, restOfKey)
}
export function set(obj: any, key: string | string[], value: any): void {
const parts = _splitKey(key)
if (parts.length === 1) {
obj[parts[0]] = value
} else if (parts.length > 1) {
const lastKeyPartIndex = parts.length - 1
const parent = get(obj, parts.slice(0, lastKeyPartIndex))
const lastKeyPart = parts[lastKeyPartIndex]
parent[lastKeyPart] = value
}
}
function _splitKey(key: string | string[]): string[] {
if (key instanceof Array) {
return key
}
return key.split('.')
}
| Add an empty-key guard around logic in set | Add an empty-key guard around logic in set
| TypeScript | mit | aranja/tux,aranja/tux,aranja/tux | typescript | ## Code Before:
/**
* Gets the value of a property on an object.
*
 * @param {Object} obj The object to read the property from
 * @param {String} key The property path, as a dot-separated string or an array of segments
* @returns {Object} the property value or 'null'.
*/
export function get(obj: any, key: string | string[]): any {
if (key.length === 0 || !obj) {
return obj ? obj : null
}
const parts = _splitKey(key)
const nextLevel = obj[parts[0]]
const restOfKey = parts.slice(1, parts.length).join('.')
return get(nextLevel, restOfKey)
}
export function set(obj: any, key: string | string[], value: any): void {
const parts = _splitKey(key)
if (parts.length === 1) {
obj[parts[0]] = value
} else {
const lastKeyPartIndex = parts.length - 1
const parent = get(obj, parts.slice(0, lastKeyPartIndex))
const lastKeyPart = parts[lastKeyPartIndex]
parent[lastKeyPart] = value
}
}
function _splitKey(key: string | string[]): string[] {
if (key instanceof Array) {
return key
}
return key.split('.')
}
## Instruction:
Add an empty-key guard around logic in set
## Code After:
/**
* Gets the value of a property on an object.
*
 * @param {Object} obj The object to read the property from
 * @param {String} key The property path, as a dot-separated string or an array of segments
* @returns {Object} the property value or 'null'.
*/
export function get(obj: any, key: string | string[]): any {
if (key.length === 0 || !obj) {
return obj ? obj : null
}
const parts = _splitKey(key)
const nextLevel = obj[parts[0]]
const restOfKey = parts.slice(1, parts.length).join('.')
return get(nextLevel, restOfKey)
}
export function set(obj: any, key: string | string[], value: any): void {
const parts = _splitKey(key)
if (parts.length === 1) {
obj[parts[0]] = value
} else if (parts.length > 1) {
const lastKeyPartIndex = parts.length - 1
const parent = get(obj, parts.slice(0, lastKeyPartIndex))
const lastKeyPart = parts[lastKeyPartIndex]
parent[lastKeyPart] = value
}
}
function _splitKey(key: string | string[]): string[] {
if (key instanceof Array) {
return key
}
return key.split('.')
}
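A small usage sketch of the helpers above (the object is made up), showing dotted and array keys plus the new empty-key guard on set:
```ts
// Sketch only: exercising get/set as defined above.
const entry = { fields: { title: 'Hello', author: { name: 'Ada' } } }

console.log(get(entry, 'fields.author.name'))   // 'Ada'
console.log(get(entry, 'fields.missing.deep'))  // null, lookup stops at the missing level

set(entry, 'fields.title', 'Updated')              // dotted string key
set(entry, ['fields', 'author', 'name'], 'Grace')  // array form of the key
set(entry, [], 'ignored')                          // empty key is now a no-op
```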
|
03f7e80234d5209aec5fb08110abb7ecfc3e84b5 | FlipConnectSDK/FCConfiguration.swift | FlipConnectSDK/FCConfiguration.swift | //
// FCConfiguration.swift
// FlipConnectSDK
//
// Created by Munir Wanis on 01/08/17.
// Copyright © 2017 Flip Connect. All rights reserved.
//
import Foundation
public class FCConfiguration {
/**
Initialize the necessary configurations to run the SDK
- Parameters:
        - environment: Environment the SDK should point to
- clientID: Your ClientID
- clientSecret: Your Client Secret
- redirectURI: The desired redirect URI (example: some://example)
- fingerPrintID: Used to send information to antifraud
**/
public init(environment: FCEnvironmentEnum, clientID: String, clientSecret: String, redirectURI: String, fingerPrintID: String? = nil) {
FCConfiguration.environment = environment
self.fingerPrintID = fingerPrintID
self.clientID = clientID
self.clientSecret = clientSecret
self.redirectURI = redirectURI
}
/// Used to send information to antifraud
public var fingerPrintID: String?
/// Your ClientID
public var clientID: String
/// Your Client Secret
public var clientSecret: String
/// The desired redirect URI (example: some://example)
public var redirectURI: String
internal static var environment: FCEnvironmentEnum = .sandbox
}
| //
// FCConfiguration.swift
// FlipConnectSDK
//
// Created by Munir Wanis on 01/08/17.
// Copyright © 2017 Flip Connect. All rights reserved.
//
import Foundation
public class FCConfiguration {
/**
Initialize the necessary configurations to run the SDK
- Parameters:
        - environment: Environment the SDK should point to
- clientID: Your ClientID
- clientSecret: Your Client Secret
- redirectURI: The desired redirect URI (example: some://example)
- fingerPrintID: Used to send information to antifraud
**/
public init(environment: FCEnvironmentEnum, clientID: String, clientSecret: String, redirectURI: String, fingerPrintID: String? = nil) {
FCConfiguration.environment = environment
FCConfiguration.fingerPrintID = fingerPrintID
FCConfiguration.clientID = clientID
FCConfiguration.clientSecret = clientSecret
FCConfiguration.redirectURI = redirectURI
}
/// Used to send information to antifraud
internal static var fingerPrintID: String?
/// Your ClientID
internal static var clientID: String = ""
/// Your Client Secret
internal static var clientSecret: String = ""
/// The desired redirect URI (example: some://example)
internal static var redirectURI: String = ""
internal static var environment: FCEnvironmentEnum = .sandbox
}
| Make Configuration properties static and internal | :art: Make Configuration properties static and internal
| Swift | mit | Flip-Payments/connect-sdk-ios,Flip-Payments/connect-sdk-ios,Flip-Payments/connect-sdk-ios,Flip-Payments/connect-sdk-ios | swift | ## Code Before:
//
// FCConfiguration.swift
// FlipConnectSDK
//
// Created by Munir Wanis on 01/08/17.
// Copyright © 2017 Flip Connect. All rights reserved.
//
import Foundation
public class FCConfiguration {
/**
Initialize the necessary configurations to run the SDK
- Parameters:
        - environment: Environment the SDK should point to
- clientID: Your ClientID
- clientSecret: Your Client Secret
- redirectURI: The desired redirect URI (example: some://example)
- fingerPrintID: Used to send information to antifraud
**/
public init(environment: FCEnvironmentEnum, clientID: String, clientSecret: String, redirectURI: String, fingerPrintID: String? = nil) {
FCConfiguration.environment = environment
self.fingerPrintID = fingerPrintID
self.clientID = clientID
self.clientSecret = clientSecret
self.redirectURI = redirectURI
}
/// Used to send information to antifraud
public var fingerPrintID: String?
/// Your ClientID
public var clientID: String
/// Your Client Secret
public var clientSecret: String
/// The desired redirect URI (example: some://example)
public var redirectURI: String
internal static var environment: FCEnvironmentEnum = .sandbox
}
## Instruction:
:art: Make Configuration properties static and internal
## Code After:
//
// FCConfiguration.swift
// FlipConnectSDK
//
// Created by Munir Wanis on 01/08/17.
// Copyright © 2017 Flip Connect. All rights reserved.
//
import Foundation
public class FCConfiguration {
/**
Initialize the necessary configurations to run the SDK
- Parameters:
        - environment: Environment the SDK should point to
- clientID: Your ClientID
- clientSecret: Your Client Secret
- redirectURI: The desired redirect URI (example: some://example)
- fingerPrintID: Used to send information to antifraud
**/
public init(environment: FCEnvironmentEnum, clientID: String, clientSecret: String, redirectURI: String, fingerPrintID: String? = nil) {
FCConfiguration.environment = environment
FCConfiguration.fingerPrintID = fingerPrintID
FCConfiguration.clientID = clientID
FCConfiguration.clientSecret = clientSecret
FCConfiguration.redirectURI = redirectURI
}
/// Used to send information to antifraud
internal static var fingerPrintID: String?
/// Your ClientID
internal static var clientID: String = ""
/// Your Client Secret
internal static var clientSecret: String = ""
/// The desired redirect URI (example: some://example)
internal static var redirectURI: String = ""
internal static var environment: FCEnvironmentEnum = .sandbox
}
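A usage sketch with placeholder credentials; after this change the initializer only populates the static fields, so the instance itself need not be kept:
```swift
// Sketch only: every value below is a placeholder.
let _ = FCConfiguration(environment: .sandbox,
                        clientID: "my-client-id",
                        clientSecret: "my-client-secret",
                        redirectURI: "some://example")
```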
|
8289fe9763e3f0ad18777f193dd1e24bd8cf1ae1 | source/handbook/people-operations/OKR/index.html.md | source/handbook/people-operations/OKR/index.html.md | ---
layout: markdown_page
title: "OKRs and LatticeHQ"
---
Our Objectives & Key Results (OKRs) are all to be found in [LatticeHQ](https://gitlab.latticehq.com).
Use this page to find or add tips, tricks, and guidance on OKRs generally and on how to use LatticeHQ specifically.
### Child Goals vs. Key Results
The key difference between Key Results and Child Goals is ownership: you will own your own Key Results, whereas child goals can be owned by someone else on your team. If you make it a child goal and someone on your team owns it, it'll show up in their personal goals tab, and they'll be responsible for updating it etc. It can still be connected to your goal as a child goal, but you won't be the one owning and updating it.
| ---
layout: markdown_page
title: "OKRs and LatticeHQ"
---
Our Objectives & Key Results (OKRs) are all to be found in [LatticeHQ](https://gitlab.latticehq.com).
Use this page to find or add tips, tricks, and guidance on OKRs generally and on how to use LatticeHQ specifically.
### Child Goals vs. Key Results
The key difference between Key Results and Child Goals is ownership: you will own your own Key Results, whereas child goals can be owned by someone else on your team. If you make it a child goal and someone on your team owns it, it'll show up in their personal goals tab, and they'll be responsible for updating it etc. It can still be connected to your goal as a child goal, but you won't be the one owning and updating it.
### Changing your Name in Lattice
Some employees do not go by their legal first names. As Lattice defaults to pulling this data from BambooHR, the name field may need to be updated. You can do that here: https://gitlab.latticehq.com/settings/user | Document how to change your name in LatticeHQ | Document how to change your name in LatticeHQ | Markdown | mit | damianhakert/damianhakert.github.io | markdown | ## Code Before:
---
layout: markdown_page
title: "OKRs and LatticeHQ"
---
Our Objectives & Key Results (OKRs) are all to be found in [LatticeHQ](https://gitlab.latticehq.com).
Use this page to find or add tips, tricks, and guidance on OKRs generally and on how to use LatticeHQ specifically.
### Child Goals vs. Key Results
The key difference between Key Results and Child Goals is ownership: you will own your own Key Results, whereas child goals can be owned by someone else on your team. If you make it a child goal and someone on your team owns it, it'll show up in their personal goals tab, and they'll be responsible for updating it etc. It can still be connected to your goal as a child goal, but you won't be the one owning and updating it.
## Instruction:
Document how to change your name in LatticeHQ
## Code After:
---
layout: markdown_page
title: "OKRs and LatticeHQ"
---
Our Objectives & Key Results (OKRs) are all to be found in [LatticeHQ](https://gitlab.latticehq.com).
Use this page to find or add tips, tricks, and guidance on OKRs generally and on how to use LatticeHQ specifically.
### Child Goals vs. Key Results
The key difference between Key Results and Child Goals is ownership: you will own your own Key Results, whereas child goals can be owned by someone else on your team. If you make it a child goal and someone on your team owns it, it'll show up in their personal goals tab, and they'll be responsible for updating it etc. It can still be connected to your goal as a child goal, but you won't be the one owning and updating it.
### Changing your Name in Lattice
Some employees do not go by their legal first names. As Lattice defaults to pulling this data from BambooHR, the name field may need to be updated. You can do that here: https://gitlab.latticehq.com/settings/user |
e1d649b029930cd0df4625277d37b7abb04fc2db | bufferpool_test.go | bufferpool_test.go | // Copyright 2013 The Bufferpool Authors. All rights reserved.
// Use of this source code is governed by the BSD 2-Clause license,
// which can be found in the LICENSE file.
package bufferpool_test
import (
"bytes"
"fmt"
"testing"
"github.com/pushrax/bufferpool"
)
func TestTakeFromEmpty(t *testing.T) {
bp := bufferpool.New(1, 1)
poolBuf := bp.Take()
if !bytes.Equal(poolBuf.Bytes(), []byte("")) {
t.Fatalf("Buffer from empty bufferpool was allocated incorrectly.")
}
}
func TestTakeFromFilled(t *testing.T) {
bp := bufferpool.New(1, 1)
bp.Give(bytes.NewBuffer([]byte("X")))
reusedBuf := bp.Take()
if !bytes.Equal(reusedBuf.Bytes(), []byte("")) {
t.Fatalf("Buffer from filled bufferpool was recycled incorrectly.")
}
}
func ExampleNew() {
catBuffer := bytes.NewBuffer([]byte("cat"))
bp := bufferpool.New(10, catBuffer.Len())
	bp.Give(catBuffer) // An error is returned, but not necessary to check
reusedBuffer := bp.Take()
reusedBuffer.Write([]byte("dog"))
fmt.Println(reusedBuffer)
// Output:
// dog
}
| // Copyright 2013 The Bufferpool Authors. All rights reserved.
// Use of this source code is governed by the BSD 2-Clause license,
// which can be found in the LICENSE file.
package bufferpool_test
import (
"bytes"
"fmt"
"testing"
"github.com/pushrax/bufferpool"
)
func TestTakeFromEmpty(t *testing.T) {
bp := bufferpool.New(1, 1)
poolBuf := bp.Take()
if !bytes.Equal(poolBuf.Bytes(), []byte("")) {
t.Fatalf("Buffer from empty bufferpool was allocated incorrectly.")
}
}
func TestTakeFromFilled(t *testing.T) {
bp := bufferpool.New(1, 1)
bp.Give(bytes.NewBuffer([]byte("X")))
reusedBuf := bp.Take()
if !bytes.Equal(reusedBuf.Bytes(), []byte("")) {
t.Fatalf("Buffer from filled bufferpool was recycled incorrectly.")
}
}
func ExampleNew() {
bp := bufferpool.New(10, 255)
dogBuffer := bp.Take()
	dogBuffer.WriteString("Dog!")
bp.Give(dogBuffer)
catBuffer := bp.Take() // dogBuffer is reused and reset.
catBuffer.WriteString("Cat!")
fmt.Println(catBuffer)
// Output:
// Cat!
}
| Make the example reflect a normal use-case | Make the example reflect a normal use-case | Go | bsd-2-clause | pushrax/bufferpool | go | ## Code Before:
// Copyright 2013 The Bufferpool Authors. All rights reserved.
// Use of this source code is governed by the BSD 2-Clause license,
// which can be found in the LICENSE file.
package bufferpool_test
import (
"bytes"
"fmt"
"testing"
"github.com/pushrax/bufferpool"
)
func TestTakeFromEmpty(t *testing.T) {
bp := bufferpool.New(1, 1)
poolBuf := bp.Take()
if !bytes.Equal(poolBuf.Bytes(), []byte("")) {
t.Fatalf("Buffer from empty bufferpool was allocated incorrectly.")
}
}
func TestTakeFromFilled(t *testing.T) {
bp := bufferpool.New(1, 1)
bp.Give(bytes.NewBuffer([]byte("X")))
reusedBuf := bp.Take()
if !bytes.Equal(reusedBuf.Bytes(), []byte("")) {
t.Fatalf("Buffer from filled bufferpool was recycled incorrectly.")
}
}
func ExampleNew() {
catBuffer := bytes.NewBuffer([]byte("cat"))
bp := bufferpool.New(10, catBuffer.Len())
	bp.Give(catBuffer) // An error is returned, but not necessary to check
reusedBuffer := bp.Take()
reusedBuffer.Write([]byte("dog"))
fmt.Println(reusedBuffer)
// Output:
// dog
}
## Instruction:
Make the example reflect a normal use-case
## Code After:
// Copyright 2013 The Bufferpool Authors. All rights reserved.
// Use of this source code is governed by the BSD 2-Clause license,
// which can be found in the LICENSE file.
package bufferpool_test
import (
"bytes"
"fmt"
"testing"
"github.com/pushrax/bufferpool"
)
func TestTakeFromEmpty(t *testing.T) {
bp := bufferpool.New(1, 1)
poolBuf := bp.Take()
if !bytes.Equal(poolBuf.Bytes(), []byte("")) {
t.Fatalf("Buffer from empty bufferpool was allocated incorrectly.")
}
}
func TestTakeFromFilled(t *testing.T) {
bp := bufferpool.New(1, 1)
bp.Give(bytes.NewBuffer([]byte("X")))
reusedBuf := bp.Take()
if !bytes.Equal(reusedBuf.Bytes(), []byte("")) {
t.Fatalf("Buffer from filled bufferpool was recycled incorrectly.")
}
}
func ExampleNew() {
bp := bufferpool.New(10, 255)
dogBuffer := bp.Take()
	dogBuffer.WriteString("Dog!")
bp.Give(dogBuffer)
catBuffer := bp.Take() // dogBuffer is reused and reset.
catBuffer.WriteString("Cat!")
fmt.Println(catBuffer)
// Output:
// Cat!
}
|
2c67aee5d8d71d1e4859b265d254f4bb9e8d5154 | PUWI_LaunchBrowser.php | PUWI_LaunchBrowser.php | <?php
class PUWI_LaunchBrowser{
public function getProjectName($projectName){
$names=preg_split("/[\/]tests/",$projectName);
$projectName=explode("/",$names[0]);
$size=sizeof($projectName);
return $projectName[$size-1];
}
/*
*@param integer $totalTests
*@param string $projectName
*/
public function launchBrowser($totalTests,$projectName){
$projectName=PUWI_LaunchBrowser::getProjectName($projectName);
echo $projectName;
$url="http://localhost/view/puwi.php"."?projectName=".$projectName."\&totalTests=".$totalTests;
$command="x-www-browser ".$url." &";
system($command);
}
}
?>
| <?php
class PUWI_LaunchBrowser{
/*
*@param integer $totalTests
*@param string $projectName
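	*@param array $passed
	*@param array $failures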
*/
public function launchBrowser($totalTests,$projectName,$passed,$failures){
$passed=PUWI_LaunchBrowser::send_array($passed);
$failures=PUWI_LaunchBrowser::send_array($failures);
$projectName=PUWI_LaunchBrowser::getProjectName($projectName);
$url="http://localhost/view/puwi.php"."?projectName=".$projectName."\&totalTests=".$totalTests."\&passed=".$passed."\&failures=".$failures;
$command="x-www-browser ".$url." &";
system($command);
}
public function getProjectName($projectName){
$names=preg_split("/[\/]tests/",$projectName);
$projectName=explode("/",$names[0]);
$size=sizeof($projectName);
return $projectName[$size-1];
}
function send_array($array) {
$tmp = serialize($array);
$tmp = urlencode($tmp);
return $tmp;
}
}
?>
| Send information about single tests to browser | Send information about single tests to browser
| PHP | bsd-3-clause | LuciaPerez/PUWI,LuciaPerez/PUWI,LuciaPerez/PUWI | php | ## Code Before:
<?php
class PUWI_LaunchBrowser{
public function getProjectName($projectName){
$names=preg_split("/[\/]tests/",$projectName);
$projectName=explode("/",$names[0]);
$size=sizeof($projectName);
return $projectName[$size-1];
}
/*
*@param integer $totalTests
*@param string $projectName
*/
public function launchBrowser($totalTests,$projectName){
$projectName=PUWI_LaunchBrowser::getProjectName($projectName);
echo $projectName;
$url="http://localhost/view/puwi.php"."?projectName=".$projectName."\&totalTests=".$totalTests;
$command="x-www-browser ".$url." &";
system($command);
}
}
?>
## Instruction:
Send information about single tests to browser
## Code After:
<?php
class PUWI_LaunchBrowser{
/*
*@param integer $totalTests
*@param string $projectName
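	*@param array $passed
	*@param array $failures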
*/
public function launchBrowser($totalTests,$projectName,$passed,$failures){
$passed=PUWI_LaunchBrowser::send_array($passed);
$failures=PUWI_LaunchBrowser::send_array($failures);
$projectName=PUWI_LaunchBrowser::getProjectName($projectName);
$url="http://localhost/view/puwi.php"."?projectName=".$projectName."\&totalTests=".$totalTests."\&passed=".$passed."\&failures=".$failures;
$command="x-www-browser ".$url." &";
system($command);
}
public function getProjectName($projectName){
$names=preg_split("/[\/]tests/",$projectName);
$projectName=explode("/",$names[0]);
$size=sizeof($projectName);
return $projectName[$size-1];
}
function send_array($array) {
$tmp = serialize($array);
$tmp = urlencode($tmp);
return $tmp;
}
}
?>
|
a3e3e45701beb8dfd9b90cb33061d70e77c10e55 | spec/lib/vimwiki_markdown/options_spec.rb | spec/lib/vimwiki_markdown/options_spec.rb | require 'spec_helper'
require 'vimwiki_markdown/options'
module VimwikiMarkdown
describe Options do
subject { Options.new }
context "no options passed" do
before do
allow(Options).to receive(:arguments).and_return(Options::DEFAULTS)
end
its(:force) { should be(true) }
its(:syntax) { should eq('markdown') }
its(:output_fullpath) { should eq("#{subject.output_dir}#{subject.title.parameterize}.html") }
its(:template_filename) { should eq('~/vimwiki/templates/default.tpl') }
end
end
end
| require 'spec_helper'
require 'vimwiki_markdown/options'
module VimwikiMarkdown
describe Options do
subject { Options.new }
context "no options passed" do
before do
allow(Options).to receive(:arguments).and_return(Options::DEFAULTS)
end
its(:force) { should be(true) }
its(:syntax) { should eq('markdown') }
its(:output_fullpath) { should eq("#{subject.output_dir}#{subject.title.parameterize}.html") }
its(:template_filename) { should eq('~/vimwiki/templates/default.tpl') }
describe "extension" do
it "deals with a different wiki extension correctly" do
allow(Options).to receive(:arguments).and_return(
["1", #force - 1/0
"markdown",
"wiki",
"~/vimwiki/site_html/",
"~/vimwiki/index.wiki",
"~/vimwiki/site_html/style.css",
"~/vimwiki/templates/",
"default",
".tpl",
"-"]
)
expect(Options.new.title).to eq("Index")
end
end
end
end
end
| Add basic covering spec for extension | Add basic covering spec for extension
| Ruby | mit | patrickdavey/vimwiki_markdown | ruby | ## Code Before:
require 'spec_helper'
require 'vimwiki_markdown/options'
module VimwikiMarkdown
describe Options do
subject { Options.new }
context "no options passed" do
before do
allow(Options).to receive(:arguments).and_return(Options::DEFAULTS)
end
its(:force) { should be(true) }
its(:syntax) { should eq('markdown') }
its(:output_fullpath) { should eq("#{subject.output_dir}#{subject.title.parameterize}.html") }
its(:template_filename) { should eq('~/vimwiki/templates/default.tpl') }
end
end
end
## Instruction:
Add basic covering spec for extension
## Code After:
require 'spec_helper'
require 'vimwiki_markdown/options'
module VimwikiMarkdown
describe Options do
subject { Options.new }
context "no options passed" do
before do
allow(Options).to receive(:arguments).and_return(Options::DEFAULTS)
end
its(:force) { should be(true) }
its(:syntax) { should eq('markdown') }
its(:output_fullpath) { should eq("#{subject.output_dir}#{subject.title.parameterize}.html") }
its(:template_filename) { should eq('~/vimwiki/templates/default.tpl') }
describe "extension" do
it "deals with a different wiki extension correctly" do
allow(Options).to receive(:arguments).and_return(
["1", #force - 1/0
"markdown",
"wiki",
"~/vimwiki/site_html/",
"~/vimwiki/index.wiki",
"~/vimwiki/site_html/style.css",
"~/vimwiki/templates/",
"default",
".tpl",
"-"]
)
expect(Options.new.title).to eq("Index")
end
end
end
end
end
|
b0b6b878bf073707dba593f834f091775b2332d1 | .eslintrc.json | .eslintrc.json | {
"extends": "airbnb-base",
"env": {
"es6": true,
"mocha": true,
"node": true
},
"rules": {
"comma-dangle": ["error", {
"arrays": "always-multiline",
"objects": "always-multiline",
"imports": "always-multiline",
"exports": "always-multiline",
"functions": "ignore"
}],
"no-underscore-dangle": ["error", { "allow": ["_id"] }]
}
}
| {
"extends": "airbnb-base",
"env": {
"es6": true,
"mocha": true,
"node": true
},
"rules": {
"comma-dangle": ["error", {
"arrays": "always-multiline",
"objects": "always-multiline",
"imports": "always-multiline",
"exports": "always-multiline",
"functions": "ignore"
}],
"no-underscore-dangle": ["error", { "allow": ["_id"] }],
"no-unused-vars": ["error", { "argsIgnorePattern": "^_" } ]
}
}
| Update eslint config to allow unused vars | Update eslint config to allow unused vars
| JSON | mit | lentz/buddyduel,lentz/buddyduel,lentz/buddyduel | json | ## Code Before:
{
"extends": "airbnb-base",
"env": {
"es6": true,
"mocha": true,
"node": true
},
"rules": {
"comma-dangle": ["error", {
"arrays": "always-multiline",
"objects": "always-multiline",
"imports": "always-multiline",
"exports": "always-multiline",
"functions": "ignore"
}],
"no-underscore-dangle": ["error", { "allow": ["_id"] }]
}
}
## Instruction:
Update eslint config to allow unused vars
## Code After:
{
"extends": "airbnb-base",
"env": {
"es6": true,
"mocha": true,
"node": true
},
"rules": {
"comma-dangle": ["error", {
"arrays": "always-multiline",
"objects": "always-multiline",
"imports": "always-multiline",
"exports": "always-multiline",
"functions": "ignore"
}],
"no-underscore-dangle": ["error", { "allow": ["_id"] }],
"no-unused-vars": ["error", { "argsIgnorePattern": "^_" } ]
}
}
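A sketch of what the new rule permits; the handler below is hypothetical, but with argsIgnorePattern "^_" the unused yet positionally required `_next` argument no longer trips no-unused-vars:
```js
// Sketch only: hypothetical Express-style error handler.
function errorHandler(err, req, res, _next) {
  res.status(500).json({ message: err.message });
}
```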
|
8db5ba61283460eed315863ecf6477891fc9b368 | devmgmtV2/controllers/ssid.controller.js | devmgmtV2/controllers/ssid.controller.js | "use strict"
const path = require('path');
const exec = require('child_process').exec;
const setSSID = (req, res) => {
const ssid = req.body['ssid'].trim();
let responseData = {
ssidSetSuccessful : true,
msg : 'Successfully set the ssid id to ' + ssid
}
if (typeof ssid === "undefined" || ssid.length < 1) {
responseData.ssidSetSuccessful = false;
responseData.msg = "Empty ssid not allowed!";
}
const cmd = path.join(__dirname, '../../CDN/modeChange.sh') + ' apmode ' + ssid
// executing the bash script for updating SSID
exec(cmd, { shell: '/bin/bash' }, (err, stdout, stderr) => {
console.log('Error: ' + err);
console.log('stdout: ' + stdout);
console.log('stderr: ' + stderr);
if(err) {
responseData.ssidSetSuccessful = false;
responseData.msg = err;
}
res.status(200).json(responseData);
});
}
module.exports = {
setSSID
}
| "use strict"
const path = require('path');
const exec = require('child_process').exec;
const setSSID = (req, res) => {
const ssid = req.body['ssid'].trim();
let responseData = {
ssidSetSuccessful : true,
msg : 'Successfully set the ssid id to ' + ssid
}
if (typeof ssid === "undefined" || ssid.length < 1) {
responseData.ssidSetSuccessful = false;
responseData.msg = "Empty ssid not allowed!";
        res.status(200).json(responseData);
        return; // invalid input handled: skip the mode-change script below
}
const cmd = path.join(__dirname, '../../CDN/modeChange.sh') + ' apmode ' + ssid
// executing the bash script for updating SSID
exec(cmd, { shell: '/bin/bash' }, (err, stdout, stderr) => {
console.log('Error: ' + err);
console.log('stdout: ' + stdout);
console.log('stderr: ' + stderr);
if(err) {
responseData.ssidSetSuccessful = false;
responseData.msg = err;
}
res.status(200).json(responseData);
});
}
module.exports = {
setSSID
}
| Return response without calling scripts if the input data is invalid | Return response without calling scripts if the input data is invalid
| JavaScript | mit | projectOpenRAP/OpenRAP,projectOpenRAP/OpenRAP,projectOpenRAP/OpenRAP,projectOpenRAP/OpenRAP,projectOpenRAP/OpenRAP | javascript | ## Code Before:
"use strict"
const path = require('path');
const exec = require('child_process').exec;
const setSSID = (req, res) => {
const ssid = req.body['ssid'].trim();
let responseData = {
ssidSetSuccessful : true,
msg : 'Successfully set the ssid id to ' + ssid
}
if (typeof ssid === "undefined" || ssid.length < 1) {
responseData.ssidSetSuccessful = false;
responseData.msg = "Empty ssid not allowed!";
}
const cmd = path.join(__dirname, '../../CDN/modeChange.sh') + ' apmode ' + ssid
// executing the bash script for updating SSID
exec(cmd, { shell: '/bin/bash' }, (err, stdout, stderr) => {
console.log('Error: ' + err);
console.log('stdout: ' + stdout);
console.log('stderr: ' + stderr);
if(err) {
responseData.ssidSetSuccessful = false;
responseData.msg = err;
}
res.status(200).json(responseData);
});
}
module.exports = {
setSSID
}
## Instruction:
Return response without calling scripts if the input data is invalid
## Code After:
"use strict"
const path = require('path');
const exec = require('child_process').exec;
const setSSID = (req, res) => {
const ssid = req.body['ssid'].trim();
let responseData = {
ssidSetSuccessful : true,
msg : 'Successfully set the ssid id to ' + ssid
}
if (typeof ssid === "undefined" || ssid.length < 1) {
responseData.ssidSetSuccessful = false;
responseData.msg = "Empty ssid not allowed!";
        res.status(200).json(responseData);
        return; // invalid input handled: skip the mode-change script below
}
const cmd = path.join(__dirname, '../../CDN/modeChange.sh') + ' apmode ' + ssid
// executing the bash script for updating SSID
exec(cmd, { shell: '/bin/bash' }, (err, stdout, stderr) => {
console.log('Error: ' + err);
console.log('stdout: ' + stdout);
console.log('stderr: ' + stderr);
if(err) {
responseData.ssidSetSuccessful = false;
responseData.msg = err;
}
res.status(200).json(responseData);
});
}
module.exports = {
setSSID
}
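A sketch of how the controller might be mounted; the route path and Express wiring are assumptions, not part of the module above:
```js
// Sketch only: hypothetical route registration for setSSID.
const express = require('express');
const { setSSID } = require('./ssid.controller');

const router = express.Router();
router.post('/ssid', setSSID);

module.exports = router;
```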
|
33349db083f51a35bb44cd53d9c5c73df7b4c1e5 | .rubocop.yml | .rubocop.yml | Style/Documentation:
Enabled: false
Lint/HandleExceptions:
Exclude:
- Rakefile
| Style/Documentation:
Enabled: false
Lint/HandleExceptions:
Exclude:
- Rakefile
Metrics/LineLength:
Max: 120
| Raise Rubocop's LineLength to 120 | Raise Rubocop's LineLength to 120
| YAML | mit | jbilbo/syfyfancam-downloader | yaml | ## Code Before:
Style/Documentation:
Enabled: false
Lint/HandleExceptions:
Exclude:
- Rakefile
## Instruction:
Raise Rubocop's LineLength to 120
## Code After:
Style/Documentation:
Enabled: false
Lint/HandleExceptions:
Exclude:
- Rakefile
Metrics/LineLength:
Max: 120
|
e00b93814fb7dc38b8ccde3e47d86687c35256a0 | coffee-chats/src/main/java/com/google/step/coffee/tasks/RequestMatchingTask.java | coffee-chats/src/main/java/com/google/step/coffee/tasks/RequestMatchingTask.java | package com.google.step.coffee.tasks;
import com.google.appengine.api.users.UserService;
import com.google.appengine.api.users.UserServiceFactory;
import com.google.step.coffee.PermissionChecker;
import com.google.step.coffee.UserManager;
import com.google.step.coffee.data.RequestStore;
import com.google.step.coffee.entity.ChatRequest;
import java.io.IOException;
import java.util.List;
import javax.servlet.ServletException;
import javax.servlet.annotation.WebServlet;
import javax.servlet.http.HttpServlet;
import javax.servlet.http.HttpServletRequest;
import javax.servlet.http.HttpServletResponse;
/**
* Servlet triggered as cron job to match current chat requests together. Triggered periodically by
* fetch.
*/
@WebServlet("/api/tasks/request-matching")
public class RequestMatchingTask extends HttpServlet {
private RequestMatcher matcher = new RequestMatcher();
@Override
protected void doGet(HttpServletRequest req, HttpServletResponse resp)
throws ServletException, IOException {
String cronHeader = req.getHeader("X-Appengine-Cron");
UserService userService = UserServiceFactory.getUserService();
if (cronHeader == null || !Boolean.parseBoolean(cronHeader) || !userService.isUserAdmin()) {
resp.sendError(HttpServletResponse.SC_FORBIDDEN, "Forbidden action.");
return;
}
RequestStore requestStore = new RequestStore();
List<ChatRequest> requestList = requestStore.getUnmatchedRequests();
matcher.matchRequests(requestList, requestStore);
resp.setStatus(HttpServletResponse.SC_OK);
}
}
| package com.google.step.coffee.tasks;
import com.google.appengine.api.users.UserService;
import com.google.appengine.api.users.UserServiceFactory;
import com.google.step.coffee.PermissionChecker;
import com.google.step.coffee.UserManager;
import com.google.step.coffee.data.RequestStore;
import com.google.step.coffee.entity.ChatRequest;
import java.io.IOException;
import java.util.List;
import javax.servlet.ServletException;
import javax.servlet.annotation.WebServlet;
import javax.servlet.http.HttpServlet;
import javax.servlet.http.HttpServletRequest;
import javax.servlet.http.HttpServletResponse;
/**
* Servlet triggered as cron job to match current chat requests together. Triggered periodically by
* fetch.
*/
@WebServlet("/api/tasks/request-matching")
public class RequestMatchingTask extends HttpServlet {
private RequestMatcher matcher = new RequestMatcher();
@Override
protected void doGet(HttpServletRequest req, HttpServletResponse resp)
throws ServletException, IOException {
String cronHeader = req.getHeader("X-Appengine-Cron");
UserService userService = UserServiceFactory.getUserService();
if (cronHeader == null || !Boolean.parseBoolean(cronHeader)) {
resp.sendError(HttpServletResponse.SC_FORBIDDEN, "Forbidden action.");
return;
}
RequestStore requestStore = new RequestStore();
List<ChatRequest> requestList = requestStore.getUnmatchedRequests();
matcher.matchRequests(requestList, requestStore);
resp.setStatus(HttpServletResponse.SC_OK);
}
}
| Change request matching cron to not allow admins to run directly | Change request matching cron to not allow admins to run directly
| Java | apache-2.0 | googleinterns/step250-2020,googleinterns/step250-2020,googleinterns/step250-2020,googleinterns/step250-2020 | java | ## Code Before:
package com.google.step.coffee.tasks;
import com.google.appengine.api.users.UserService;
import com.google.appengine.api.users.UserServiceFactory;
import com.google.step.coffee.PermissionChecker;
import com.google.step.coffee.UserManager;
import com.google.step.coffee.data.RequestStore;
import com.google.step.coffee.entity.ChatRequest;
import java.io.IOException;
import java.util.List;
import javax.servlet.ServletException;
import javax.servlet.annotation.WebServlet;
import javax.servlet.http.HttpServlet;
import javax.servlet.http.HttpServletRequest;
import javax.servlet.http.HttpServletResponse;
/**
* Servlet triggered as cron job to match current chat requests together. Triggered periodically by
* fetch.
*/
@WebServlet("/api/tasks/request-matching")
public class RequestMatchingTask extends HttpServlet {
private RequestMatcher matcher = new RequestMatcher();
@Override
protected void doGet(HttpServletRequest req, HttpServletResponse resp)
throws ServletException, IOException {
String cronHeader = req.getHeader("X-Appengine-Cron");
UserService userService = UserServiceFactory.getUserService();
if (cronHeader == null || !Boolean.parseBoolean(cronHeader) || !userService.isUserAdmin()) {
resp.sendError(HttpServletResponse.SC_FORBIDDEN, "Forbidden action.");
return;
}
RequestStore requestStore = new RequestStore();
List<ChatRequest> requestList = requestStore.getUnmatchedRequests();
matcher.matchRequests(requestList, requestStore);
resp.setStatus(HttpServletResponse.SC_OK);
}
}
## Instruction:
Change request matching cron to not allow admins to run directly
## Code After:
package com.google.step.coffee.tasks;
import com.google.appengine.api.users.UserService;
import com.google.appengine.api.users.UserServiceFactory;
import com.google.step.coffee.PermissionChecker;
import com.google.step.coffee.UserManager;
import com.google.step.coffee.data.RequestStore;
import com.google.step.coffee.entity.ChatRequest;
import java.io.IOException;
import java.util.List;
import javax.servlet.ServletException;
import javax.servlet.annotation.WebServlet;
import javax.servlet.http.HttpServlet;
import javax.servlet.http.HttpServletRequest;
import javax.servlet.http.HttpServletResponse;
/**
* Servlet triggered as cron job to match current chat requests together. Triggered periodically by
* fetch.
*/
@WebServlet("/api/tasks/request-matching")
public class RequestMatchingTask extends HttpServlet {
private RequestMatcher matcher = new RequestMatcher();
@Override
protected void doGet(HttpServletRequest req, HttpServletResponse resp)
throws ServletException, IOException {
String cronHeader = req.getHeader("X-Appengine-Cron");
UserService userService = UserServiceFactory.getUserService();
if (cronHeader == null || !Boolean.parseBoolean(cronHeader)) {
resp.sendError(HttpServletResponse.SC_FORBIDDEN, "Forbidden action.");
return;
}
RequestStore requestStore = new RequestStore();
List<ChatRequest> requestList = requestStore.getUnmatchedRequests();
matcher.matchRequests(requestList, requestStore);
resp.setStatus(HttpServletResponse.SC_OK);
}
}
|
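
Some context for why the admin check could be dropped above: on App Engine, requests started by the Cron service carry the `X-Appengine-Cron: true` header, and the platform strips that header from external traffic, so its presence is already a reliable gate on its own. The remaining guard reduces to a small predicate; the class below is an illustrative sketch (not part of the project above, names are hypothetical) that simply exercises that predicate.

```java
// Illustrative sketch only -- not part of the project above; class and method names are hypothetical.
public class CronHeaderCheckDemo {
    /** Mirrors the guard kept in the revised servlet: allow only "X-Appengine-Cron: true". */
    static boolean isCronRequest(String cronHeaderValue) {
        return cronHeaderValue != null && Boolean.parseBoolean(cronHeaderValue);
    }

    public static void main(String[] args) {
        System.out.println(isCronRequest("true"));   // true  -> request handled
        System.out.println(isCronRequest(null));     // false -> 403 Forbidden
        System.out.println(isCronRequest("false"));  // false -> 403 Forbidden
    }
}
```
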
201579d1cdeadb0bacaf4e9b7b13e4d89a0a6054 | planner/templates/planner/signup.html | planner/templates/planner/signup.html | <!DOCTYPE html>
<html lang="it">
<head>
<title>Sign up</title>
{% include 'planner/elements/base_head.html' %}
</head>
<body>
{% if user_signup.errors %}
{{ user_signup.error }}
{% endif %}
{% if profile_signup.errors %}
{{ profile_signup.errors }}
{% endif %}
<form method="post" action="{% url 'planner:signup' %}">
{% csrf_token %}
{{ user_signup.as_p }}
{{ profile_signup.as_p }}
<button>Sign up</button>
</form>
</body>
</html> | <!DOCTYPE html>
<html lang="it">
<head>
<title>Sign up</title>
{% include 'planner/elements/base_head.html' %}
</head>
<body>
<form method="post" action="{% url 'planner:signup' %}">
{% csrf_token %}
{{ user_signup.as_p }}
{{ profile_signup.as_p }}
<button>Sign up</button>
</form>
</body>
</html> | Remove duplicate errors. They already appear close to the input if any | Remove duplicate errors. They already appear close to the input if any
| HTML | mit | livingsilver94/getaride,livingsilver94/getaride,livingsilver94/getaride | html | ## Code Before:
<!DOCTYPE html>
<html lang="it">
<head>
<title>Sign up</title>
{% include 'planner/elements/base_head.html' %}
</head>
<body>
{% if user_signup.errors %}
{{ user_signup.error }}
{% endif %}
{% if profile_signup.errors %}
{{ profile_signup.errors }}
{% endif %}
<form method="post" action="{% url 'planner:signup' %}">
{% csrf_token %}
{{ user_signup.as_p }}
{{ profile_signup.as_p }}
<button>Sign up</button>
</form>
</body>
</html>
## Instruction:
Remove duplicate errors. They already appear close to the input if any
## Code After:
<!DOCTYPE html>
<html lang="it">
<head>
<title>Sign up</title>
{% include 'planner/elements/base_head.html' %}
</head>
<body>
<form method="post" action="{% url 'planner:signup' %}">
{% csrf_token %}
{{ user_signup.as_p }}
{{ profile_signup.as_p }}
<button>Sign up</button>
</form>
</body>
</html> |
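
The removal above is safe because `{{ form.as_p }}` already renders each bound field's errors next to its input (and non-field errors at the start of its output), so the extra `{% if ... %}` blocks printed the same messages twice. If error placement ever needs hand control, fields can be rendered explicitly instead; the fragment below is an illustrative sketch, not part of the template above, showing one of the two forms for brevity.

```html
{# Illustrative sketch only -- not part of the template above. #}
<form method="post" action="{% url 'planner:signup' %}">
  {% csrf_token %}
  {{ user_signup.non_field_errors }}
  {% for field in user_signup %}
    {{ field.errors }} {# the same error list as_p would print for this field #}
    <p>{{ field.label_tag }} {{ field }}</p>
  {% endfor %}
  <button>Sign up</button>
</form>
```
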
f84bc64e41c2f33a5d6224cc46ab905c4a429352 | .travis.yml | .travis.yml | dist: trusty
before_install:
- npm install -g npm@latest
- npm set progress=false
language: node_js
node_js:
- "5.7"
- "4.3"
- "3.3"
- "0.10"
script: npm run test:ci
after_success:
- curl -Lo travis_after_all.py https://raw.githubusercontent.com/contentful/travis_after_all/master/travis_after_all.py
- python travis_after_all.py
- cat ./coverage/lcov.info | ./node_modules/.bin/coveralls
- export $(cat .to_export_back) &> /dev/null
- npm run semantic-release
addons:
sauce_connect: true
branches:
except:
- "/^v\\d+\\.\\d+\\.\\d+$/"
| dist: trusty
before_install:
- npm install -g npm@latest
- npm set progress=false
language: node_js
node_js:
- "5.7"
- "0.10"
script: npm run test:ci
after_success:
- curl -Lo travis_after_all.py https://raw.githubusercontent.com/contentful/travis_after_all/master/travis_after_all.py
- python travis_after_all.py
- cat ./coverage/lcov.info | ./node_modules/.bin/coveralls
- export $(cat .to_export_back) &> /dev/null
- npm run semantic-release
addons:
sauce_connect: true
branches:
except:
- "/^v\\d+\\.\\d+\\.\\d+$/"
| Remove node 3 and 5 builds | chore: Remove node 3 and 5 builds
Previous number of builds quickly blows up the space creation rate limit.
| YAML | mit | contentful/contentful-management.js,contentful/contentful-management.js | yaml | ## Code Before:
dist: trusty
before_install:
- npm install -g npm@latest
- npm set progress=false
language: node_js
node_js:
- "5.7"
- "4.3"
- "3.3"
- "0.10"
script: npm run test:ci
after_success:
- curl -Lo travis_after_all.py https://raw.githubusercontent.com/contentful/travis_after_all/master/travis_after_all.py
- python travis_after_all.py
- cat ./coverage/lcov.info | ./node_modules/.bin/coveralls
- export $(cat .to_export_back) &> /dev/null
- npm run semantic-release
addons:
sauce_connect: true
branches:
except:
- "/^v\\d+\\.\\d+\\.\\d+$/"
## Instruction:
chore: Remove node 3 and 5 builds
Previous number of builds quickly blows up the space creation rate limit.
## Code After:
dist: trusty
before_install:
- npm install -g npm@latest
- npm set progress=false
language: node_js
node_js:
- "5.7"
- "0.10"
script: npm run test:ci
after_success:
- curl -Lo travis_after_all.py https://raw.githubusercontent.com/contentful/travis_after_all/master/travis_after_all.py
- python travis_after_all.py
- cat ./coverage/lcov.info | ./node_modules/.bin/coveralls
- export $(cat .to_export_back) &> /dev/null
- npm run semantic-release
addons:
sauce_connect: true
branches:
except:
- "/^v\\d+\\.\\d+\\.\\d+$/"
|
f6d8cba8e8aaa443969737b494bf3e7fc000b8f4 | utils/scripts/header_setup/replace_all.sh | utils/scripts/header_setup/replace_all.sh |
SCRIPT_DIR="$( cd "$( dirname "${BASH_SOURCE[0]}" )" && pwd )"
# Change to root compss directory
cd "${SCRIPT_DIR}/../../../compss/" || exit 1
# Add java headers
find . -name "*.java" -exec "${SCRIPT_DIR}/replace_header.sh" {} java_c \;
# Add c headers
find . -name "*.c" -exec "${SCRIPT_DIR}/replace_header.sh" {} java_c \;
find . -name "*.cc" -exec "${SCRIPT_DIR}/replace_header.sh" {} java_c \;
find . -name "*.h" -exec "${SCRIPT_DIR}/replace_header.sh" {} java_c \;
find . -name "Makefile*" -exec "${SCRIPT_DIR}/replace_header.sh" {} python \;
# Add python headers
find . -name "*.py" -exec "${SCRIPT_DIR}/replace_header.sh" {} python \;
| SCRIPT_DIR="$( cd "$( dirname "${BASH_SOURCE[0]}" )" && pwd )"
INSPECTED_DIRS="compss maven-plugins performance_analysis utils/storage"
#
# HELPER METHODS
#
change_headers() {
# Add java headers
find . -name "*.java" -exec "${SCRIPT_DIR}/replace_header.sh" {} java_c \;
# Add c headers
find . -name "*.c" -exec "${SCRIPT_DIR}/replace_header.sh" {} java_c \;
find . -name "*.cc" -exec "${SCRIPT_DIR}/replace_header.sh" {} java_c \;
find . -name "*.h" -exec "${SCRIPT_DIR}/replace_header.sh" {} java_c \;
find . -name "Makefile*" -exec "${SCRIPT_DIR}/replace_header.sh" {} python \;
# Add python headers
find . -name "*.py" -exec "${SCRIPT_DIR}/replace_header.sh" {} python \;
}
#
# MAIN METHOD
#
main() {
echo "[INFO] Updating headers..."
for inspect_dir in ${INSPECTED_DIRS}; do
echo "[INFO] Updating headers on ${inspect_dir}..."
cd "${SCRIPT_DIR}/../../../${inspect_dir}" || exit 1
change_headers
cd "${SCRIPT_DIR}" || exit 1
echo "[INFO] Headers updated on ${inspect_dir}"
done
echo "DONE"
}
#
# ENTRY POINT
#
main
| Update replace headers script to handle all repo files | Update replace headers script to handle all repo files
| Shell | apache-2.0 | mF2C/COMPSs,mF2C/COMPSs,mF2C/COMPSs,mF2C/COMPSs,mF2C/COMPSs,mF2C/COMPSs | shell | ## Code Before:
SCRIPT_DIR="$( cd "$( dirname "${BASH_SOURCE[0]}" )" && pwd )"
# Change to root compss directory
cd "${SCRIPT_DIR}/../../../compss/" || exit 1
# Add java headers
find . -name "*.java" -exec "${SCRIPT_DIR}/replace_header.sh" {} java_c \;
# Add c headers
find . -name "*.c" -exec "${SCRIPT_DIR}/replace_header.sh" {} java_c \;
find . -name "*.cc" -exec "${SCRIPT_DIR}/replace_header.sh" {} java_c \;
find . -name "*.h" -exec "${SCRIPT_DIR}/replace_header.sh" {} java_c \;
find . -name "Makefile*" -exec "${SCRIPT_DIR}/replace_header.sh" {} python \;
# Add python headers
find . -name "*.py" -exec "${SCRIPT_DIR}/replace_header.sh" {} python \;
## Instruction:
Update replace headers script to handle all repo files
## Code After:
SCRIPT_DIR="$( cd "$( dirname "${BASH_SOURCE[0]}" )" && pwd )"
INSPECTED_DIRS="compss maven-plugins performance_analysis utils/storage"
#
# HELPER METHODS
#
change_headers() {
# Add java headers
find . -name "*.java" -exec "${SCRIPT_DIR}/replace_header.sh" {} java_c \;
# Add c headers
find . -name "*.c" -exec "${SCRIPT_DIR}/replace_header.sh" {} java_c \;
find . -name "*.cc" -exec "${SCRIPT_DIR}/replace_header.sh" {} java_c \;
find . -name "*.h" -exec "${SCRIPT_DIR}/replace_header.sh" {} java_c \;
find . -name "Makefile*" -exec "${SCRIPT_DIR}/replace_header.sh" {} python \;
# Add python headers
find . -name "*.py" -exec "${SCRIPT_DIR}/replace_header.sh" {} python \;
}
#
# MAIN METHOD
#
main() {
echo "[INFO] Updating headers..."
for inspect_dir in ${INSPECTED_DIRS}; do
echo "[INFO] Updating headers on ${inspect_dir}..."
cd "${SCRIPT_DIR}/../../../${inspect_dir}" || exit 1
change_headers
cd "${SCRIPT_DIR}" || exit 1
echo "[INFO] Headers updated on ${inspect_dir}"
done
echo "DONE"
}
#
# ENTRY POINT
#
main
|
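
Since the rewritten script now descends into four separate trees, a quick way to see what it would touch is to run the same `find` patterns with `-print` only. The lines below are an illustrative sketch, not part of the repository above, and assume they are run from the repository root.

```bash
#!/bin/bash
# Illustrative sketch only -- not part of the repository above. Run from the repo root.
INSPECTED_DIRS="compss maven-plugins performance_analysis utils/storage"
for inspect_dir in ${INSPECTED_DIRS}; do
  echo "[DRY-RUN] files that would get a new header under ${inspect_dir}:"
  find "${inspect_dir}" \( -name "*.java" -o -name "*.c"  -o -name "*.cc" \
                        -o -name "*.h"    -o -name "*.py" -o -name "Makefile*" \) -print
done
```
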
a8432a4cf010ecce3202dc05da512fed9c967f79 | README.md | README.md | A [Docker](https://www.docker.com/) container with
- node 0.10 (latest)
- npm 2.7.x
- chrome + firefox for karma headless testing
| A [Docker](https://www.docker.com/) container with
- node 0.10 (latest)
- npm 2.7.x
- chrome + firefox for karma headless testing
```bash
# run it
$ docker run emmenko/nodejs-karma:v0.1.3 sh -lc "ps aux |grep Xvfb"
stdin: is not a tty
Starting virtual X frame buffer: Xvfb.
root 1 0.0 0.0 4444 1708 ? Ss 09:45 0:00 sh -lc ps aux |grep Xvfb
root 12 0.0 0.0 4344 976 ? R 09:45 0:00 start-stop-daemon --start --quiet --pidfile /var/run/xvfb.pid --make-pidfile --background --exec /usr/bin/Xvfb -- :99 -screen 0 1024x768x24 -ac +extension GLX +render -noreset
root 14 0.0 0.0 4444 120 ? R 09:45 0:00 sh -lc ps aux |grep Xvfb
``` | Add example how to run it | Add example how to run it
| Markdown | mit | nitrojs/docker-nodejs-karma,emmenko/docker-nodejs-karma | markdown | ## Code Before:
A [Docker](https://www.docker.com/) container with
- node 0.10 (latest)
- npm 2.7.x
- chrome + firefox for karma headless testing
## Instruction:
Add example how to run it
## Code After:
A [Docker](https://www.docker.com/) container with
- node 0.10 (latest)
- npm 2.7.x
- chrome + firefox for karma headless testing
```bash
# run it
$ docker run emmenko/nodejs-karma:v0.1.3 sh -lc "ps aux |grep Xvfb"
stdin: is not a tty
Starting virtual X frame buffer: Xvfb.
root 1 0.0 0.0 4444 1708 ? Ss 09:45 0:00 sh -lc ps aux |grep Xvfb
root 12 0.0 0.0 4344 976 ? R 09:45 0:00 start-stop-daemon --start --quiet --pidfile /var/run/xvfb.pid --make-pidfile --background --exec /usr/bin/Xvfb -- :99 -screen 0 1024x768x24 -ac +extension GLX +render -noreset
root 14 0.0 0.0 4444 120 ? R 09:45 0:00 sh -lc ps aux |grep Xvfb
``` |
7afba7da053e3ee76e429b735fa92a93d5d5ed4c | python/README.md | python/README.md | * [Solving problems using Python](#solving-problems-using-python)
* [Q1: Join elements in an array](#q1-join-elements-in-an-array)
* [Q2: Pop elements in an array](#q2-pop-elements-in-an-array)
* [Q3: Print n elements in a string](#q3-print-n-elements-in-a-string)
##Q1: Join elements in an array
a1 = ['this','is','a','sentence']
a2 = '### '.join(a1)
print a2
##Q2: Pop elements in an array
a1 = ['this','is','a','sentence']
print a1.pop(-1) #last element in array
print a1.pop(0) #first element in array
print a1.pop(1) #new second element in array
##Q3: Print n elements in a string
a1 = "Hello World!"
#print 5 characters starting from position 0
print a1[0:5] #Hello
[![Analytics](https://ga-beacon.appspot.com/UA-55381661-1/tools/cmd/readme)](https://github.com/igrigorik/ga-beacon)
| * [Solving problems using Python](#solving-problems-using-python)
* [Q1: Join elements in an array](#q1-join-elements-in-an-array)
* [Q2: Pop elements in an array](#q2-pop-elements-in-an-array)
* [Q3: Print n elements in a string](#q3-print-n-elements-in-a-string)
##Q1: Join elements in an array
a1 = ['this','is','a','sentence']
a2 = '### '.join(a1)
print a2
##Q2: Pop elements in an array
a1 = ['this','is','a','sentence']
print a1.pop(-1) #last element in array
print a1.pop(0) #first element in array
print a1.pop(1) #new second element in array
##Q3: Print n elements in a string
a1 = "Hello World!"
#print 5 characters starting from position 0
print a1[0:5] #Hello
##Q4: How can you get more information about python modules?
$>pip freeze #list all python modules and their version
$>pip freeze > requirements.txt #generate requirements file
$>pip show py2neo #show information about py2neo; output includes version #
$>
[![Analytics](https://ga-beacon.appspot.com/UA-55381661-1/tools/cmd/readme)](https://github.com/igrigorik/ga-beacon)
| Add question to get information about python modules | Add question to get information about python modules
| Markdown | mit | harishvc/tools,harishvc/tools,harishvc/tools,harishvc/tools,harishvc/tools | markdown | ## Code Before:
* [Solving problems using Python](#solving-problems-using-python)
* [Q1: Join elements in an array](#q1-join-elements-in-an-array)
* [Q2: Pop elements in an array](#q2-pop-elements-in-an-array)
* [Q3: Print n elements in a string](#q3-print-n-elements-in-a-string)
##Q1: Join elements in an array
a1 = ['this','is','a','sentence']
a2 = '### '.join(a1)
print a2
##Q2: Pop elements in an array
a1 = ['this','is','a','sentence']
print a1.pop(-1) #last element in array
print a1.pop(0) #first element in array
print a1.pop(1) #new second element in array
##Q3: Print n elements in a string
a1 = "Hello World!"
#print 5 characters starting from position 0
print a1[0:5] #Hello
[![Analytics](https://ga-beacon.appspot.com/UA-55381661-1/tools/cmd/readme)](https://github.com/igrigorik/ga-beacon)
## Instruction:
Add question to get information about python modules
## Code After:
* [Solving problems using Python](#solving-problems-using-python)
* [Q1: Join elements in an array](#q1-join-elements-in-an-array)
* [Q2: Pop elements in an array](#q2-pop-elements-in-an-array)
* [Q3: Print n elements in a string](#q3-print-n-elements-in-a-string)
##Q1: Join elements in an array
a1 = ['this','is','a','sentence']
a2 = '### '.join(a1)
print a2
##Q2: Pop elements in an array
a1 = ['this','is','a','sentence']
print a1.pop(-1) #last element in array
print a1.pop(0) #first element in array
print a1.pop(1) #new second element in array
##Q3: Print n elements in a string
a1 = "Hello World!"
#print 5 characters starting from position 0
print a1[0:5] #Hello
##Q4: How can you get more information about python modules?
$>pip freeze #list all python modules and their version
$>pip freeze > requirements.txt #generate requirements file
$>pip show py2neo #show information about py2neo; output includes version #
$>
[![Analytics](https://ga-beacon.appspot.com/UA-55381661-1/tools/cmd/readme)](https://github.com/igrigorik/ga-beacon)
|
2a4a0c4d71dcbc1820de85c4576aadae7bc17c70 | src/backend/nls.mk | src/backend/nls.mk | CATALOG_NAME := postgres
AVAIL_LANGUAGES := cs de es hr hu ru sv tr zh_CN zh_TW
GETTEXT_FILES := + gettext-files
GETTEXT_TRIGGERS:= elog:2 postmaster_error yyerror
gettext-files:
find $(srcdir)/ -name '*.c' -print >$@
my-maintainer-clean:
rm -f gettext-files
.PHONY: my-maintainer-clean
maintainer-clean: my-maintainer-clean
| CATALOG_NAME := postgres
AVAIL_LANGUAGES := cs de es hr hu ru sv tr zh_CN zh_TW
GETTEXT_FILES := + gettext-files
# elog should eventually be removed from this list:
GETTEXT_TRIGGERS:= elog:2 errmsg errdetail errhint errcontext postmaster_error yyerror
gettext-files:
find $(srcdir)/ -name '*.c' -print >$@
my-maintainer-clean:
rm -f gettext-files
.PHONY: my-maintainer-clean
maintainer-clean: my-maintainer-clean
| Add ereport-related functions to GETTEXT_TRIGGERS list. | Add ereport-related functions to GETTEXT_TRIGGERS list.
| Makefile | apache-2.0 | lintzc/gpdb,atris/gpdb,rubikloud/gpdb,janebeckman/gpdb,rvs/gpdb,kmjungersen/PostgresXL,xinzweb/gpdb,randomtask1155/gpdb,foyzur/gpdb,lisakowen/gpdb,50wu/gpdb,50wu/gpdb,CraigHarris/gpdb,zeroae/postgres-xl,rubikloud/gpdb,janebeckman/gpdb,atris/gpdb,tpostgres-projects/tPostgres,techdragon/Postgres-XL,lpetrov-pivotal/gpdb,greenplum-db/gpdb,zaksoup/gpdb,Chibin/gpdb,adam8157/gpdb,xinzweb/gpdb,zaksoup/gpdb,Quikling/gpdb,xuegang/gpdb,lisakowen/gpdb,atris/gpdb,cjcjameson/gpdb,chrishajas/gpdb,royc1/gpdb,yuanzhao/gpdb,lisakowen/gpdb,oberstet/postgres-xl,royc1/gpdb,Quikling/gpdb,atris/gpdb,lintzc/gpdb,xinzweb/gpdb,randomtask1155/gpdb,royc1/gpdb,lpetrov-pivotal/gpdb,ahachete/gpdb,greenplum-db/gpdb,xinzweb/gpdb,lisakowen/gpdb,tangp3/gpdb,foyzur/gpdb,atris/gpdb,kaknikhil/gpdb,jmcatamney/gpdb,foyzur/gpdb,edespino/gpdb,Chibin/gpdb,kmjungersen/PostgresXL,pavanvd/postgres-xl,techdragon/Postgres-XL,janebeckman/gpdb,rubikloud/gpdb,lpetrov-pivotal/gpdb,Postgres-XL/Postgres-XL,edespino/gpdb,CraigHarris/gpdb,pavanvd/postgres-xl,greenplum-db/gpdb,rvs/gpdb,zeroae/postgres-xl,rvs/gpdb,ashwinstar/gpdb,Postgres-XL/Postgres-XL,yazun/postgres-xl,50wu/gpdb,edespino/gpdb,randomtask1155/gpdb,ovr/postgres-xl,foyzur/gpdb,Quikling/gpdb,tpostgres-projects/tPostgres,Postgres-XL/Postgres-XL,janebeckman/gpdb,rubikloud/gpdb,Chibin/gpdb,greenplum-db/gpdb,chrishajas/gpdb,tangp3/gpdb,kaknikhil/gpdb,kmjungersen/PostgresXL,CraigHarris/gpdb,cjcjameson/gpdb,pavanvd/postgres-xl,Quikling/gpdb,CraigHarris/gpdb,chrishajas/gpdb,janebeckman/gpdb,edespino/gpdb,tpostgres-projects/tPostgres,xinzweb/gpdb,snaga/postgres-xl,lintzc/gpdb,jmcatamney/gpdb,jmcatamney/gpdb,kaknikhil/gpdb,xinzweb/gpdb,50wu/gpdb,zeroae/postgres-xl,yuanzhao/gpdb,kaknikhil/gpdb,snaga/postgres-xl,50wu/gpdb,janebeckman/gpdb,janebeckman/gpdb,arcivanov/postgres-xl,ovr/postgres-xl,ahachete/gpdb,Chibin/gpdb,Chibin/gpdb,lisakowen/gpdb,royc1/gpdb,lpetrov-pivotal/gpdb,yuanzhao/gpdb,adam8157/gpdb,rubikloud/gpdb,xuegang/gpdb,ashwinstar/gpdb,zaksoup/gpdb,kaknikhil/gpdb,jmcatamney/gpdb,edespino/gpdb,rvs/gpdb,Quikling/gpdb,rvs/gpdb,chrishajas/gpdb,yuanzhao/gpdb,tangp3/gpdb,cjcjameson/gpdb,yazun/postgres-xl,kaknikhil/gpdb,royc1/gpdb,atris/gpdb,chrishajas/gpdb,ashwinstar/gpdb,cjcjameson/gpdb,foyzur/gpdb,adam8157/gpdb,CraigHarris/gpdb,adam8157/gpdb,royc1/gpdb,greenplum-db/gpdb,adam8157/gpdb,ahachete/gpdb,ashwinstar/gpdb,50wu/gpdb,janebeckman/gpdb,ovr/postgres-xl,xinzweb/gpdb,yuanzhao/gpdb,zaksoup/gpdb,rvs/gpdb,rubikloud/gpdb,jmcatamney/gpdb,ahachete/gpdb,Quikling/gpdb,0x0FFF/gpdb,rvs/gpdb,foyzur/gpdb,zaksoup/gpdb,cjcjameson/gpdb,lpetrov-pivotal/gpdb,randomtask1155/gpdb,arcivanov/postgres-xl,lintzc/gpdb,randomtask1155/gpdb,postmind-net/postgres-xl,lpetrov-pivotal/gpdb,adam8157/gpdb,rvs/gpdb,oberstet/postgres-xl,greenplum-db/gpdb,tangp3/gpdb,ovr/postgres-xl,Quikling/gpdb,Chibin/gpdb,rubikloud/gpdb,randomtask1155/gpdb,janebeckman/gpdb,yazun/postgres-xl,Chibin/gpdb,kmjungersen/PostgresXL,kaknikhil/gpdb,xuegang/gpdb,lintzc/gpdb,foyzur/gpdb,lisakowen/gpdb,cjcjameson/gpdb,foyzur/gpdb,lisakowen/gpdb,ahachete/gpdb,kmjungersen/PostgresXL,greenplum-db/gpdb,lintzc/gpdb,techdragon/Postgres-XL,zaksoup/gpdb,ovr/postgres-xl,xuegang/gpdb,Postgres-XL/Postgres-XL,rvs/gpdb,adam8157/gpdb,royc1/gpdb,kaknikhil/gpdb,arcivanov/postgres-xl,CraigHarris/gpdb,edespino/gpdb,yuanzhao/gpdb,CraigHarris/gpdb,cjcjameson/gpdb,50wu/gpdb,cjcjameson/gpdb,ahachete/gpdb,yuanzhao/gpdb,randomtask1155/gpdb,edespino/gpdb,ashwinstar/gpdb,Quikling/gpdb,royc1/gpdb,yuanzhao/gpdb,lintzc/gpdb,postmind-net/postgres-xl,l
intzc/gpdb,0x0FFF/gpdb,0x0FFF/gpdb,techdragon/Postgres-XL,greenplum-db/gpdb,ahachete/gpdb,Chibin/gpdb,tpostgres-projects/tPostgres,adam8157/gpdb,edespino/gpdb,0x0FFF/gpdb,postmind-net/postgres-xl,0x0FFF/gpdb,50wu/gpdb,tangp3/gpdb,techdragon/Postgres-XL,CraigHarris/gpdb,pavanvd/postgres-xl,jmcatamney/gpdb,Chibin/gpdb,pavanvd/postgres-xl,lisakowen/gpdb,snaga/postgres-xl,CraigHarris/gpdb,0x0FFF/gpdb,atris/gpdb,rvs/gpdb,yazun/postgres-xl,0x0FFF/gpdb,kaknikhil/gpdb,yuanzhao/gpdb,xuegang/gpdb,cjcjameson/gpdb,xuegang/gpdb,edespino/gpdb,tangp3/gpdb,zaksoup/gpdb,randomtask1155/gpdb,oberstet/postgres-xl,ashwinstar/gpdb,chrishajas/gpdb,arcivanov/postgres-xl,lintzc/gpdb,0x0FFF/gpdb,Chibin/gpdb,jmcatamney/gpdb,zaksoup/gpdb,ashwinstar/gpdb,Quikling/gpdb,oberstet/postgres-xl,yuanzhao/gpdb,ashwinstar/gpdb,lpetrov-pivotal/gpdb,jmcatamney/gpdb,atris/gpdb,zeroae/postgres-xl,edespino/gpdb,xuegang/gpdb,cjcjameson/gpdb,yazun/postgres-xl,janebeckman/gpdb,snaga/postgres-xl,xuegang/gpdb,arcivanov/postgres-xl,kaknikhil/gpdb,tpostgres-projects/tPostgres,Postgres-XL/Postgres-XL,rubikloud/gpdb,zeroae/postgres-xl,chrishajas/gpdb,oberstet/postgres-xl,lpetrov-pivotal/gpdb,tangp3/gpdb,tangp3/gpdb,chrishajas/gpdb,Quikling/gpdb,postmind-net/postgres-xl,xinzweb/gpdb,ahachete/gpdb,snaga/postgres-xl,xuegang/gpdb,postmind-net/postgres-xl,arcivanov/postgres-xl | makefile | ## Code Before:
CATALOG_NAME := postgres
AVAIL_LANGUAGES := cs de es hr hu ru sv tr zh_CN zh_TW
GETTEXT_FILES := + gettext-files
GETTEXT_TRIGGERS:= elog:2 postmaster_error yyerror
gettext-files:
find $(srcdir)/ -name '*.c' -print >$@
my-maintainer-clean:
rm -f gettext-files
.PHONY: my-maintainer-clean
maintainer-clean: my-maintainer-clean
## Instruction:
Add ereport-related functions to GETTEXT_TRIGGERS list.
## Code After:
CATALOG_NAME := postgres
AVAIL_LANGUAGES := cs de es hr hu ru sv tr zh_CN zh_TW
GETTEXT_FILES := + gettext-files
# elog should eventually be removed from this list:
GETTEXT_TRIGGERS:= elog:2 errmsg errdetail errhint errcontext postmaster_error yyerror
gettext-files:
find $(srcdir)/ -name '*.c' -print >$@
my-maintainer-clean:
rm -f gettext-files
.PHONY: my-maintainer-clean
maintainer-clean: my-maintainer-clean
|
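
The names added to `GETTEXT_TRIGGERS` above are the functions whose string arguments xgettext should collect into the translation catalog. In backend code they appear inside `ereport()` calls, so marking `errmsg`, `errdetail`, `errhint` and `errcontext` as triggers is what lets literals like the ones below end up in the message catalog. The fragment is illustrative only — the error text and variable are made up, and it assumes it sits inside a backend function with the usual PostgreSQL headers included.

```c
/* Illustrative fragment only -- not from the commit above; assumes postgres.h /
 * utils/elog.h are already included and optname is defined by the enclosing code. */
ereport(ERROR,
        (errcode(ERRCODE_SYNTAX_ERROR),
         errmsg("unrecognized option \"%s\"", optname),   /* extracted via the errmsg trigger */
         errdetail("This option is not supported."),      /* extracted via errdetail */
         errhint("Check the server documentation.")));    /* extracted via errhint */
```
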
6f1ef6a2567963ba0b44fea68312a781022e7b97 | cleanup.sh | cleanup.sh |
git clean -df
git checkout app_prototype/config/routes.rb app_prototype/db/schema.rb
rm -rfv app_prototype/log/*
rm -rfv app_prototype/tmp/*
rm -rfv app_prototype/coverage
|
git clean -df
git checkout app_prototype/config/routes.rb app_prototype/db/schema.rb
find . -name ".DS_Store" -exec rm -f {} \;
rm -rfv app_prototype/.DS_Store
rm -rfv app_prototype/log/*
rm -rfv app_prototype/tmp/*
rm -rfv app_prototype/coverage
| Remove all instances of .DS_Store. | Remove all instances of .DS_Store.
| Shell | mit | carbonfive/raygun,carbonfive/raygun | shell | ## Code Before:
git clean -df
git checkout app_prototype/config/routes.rb app_prototype/db/schema.rb
rm -rfv app_prototype/log/*
rm -rfv app_prototype/tmp/*
rm -rfv app_prototype/coverage
## Instruction:
Remove all instances of .DS_Store.
## Code After:
git clean -df
git checkout app_prototype/config/routes.rb app_prototype/db/schema.rb
find . -name ".DS_Store" -exec rm -f {} \;
rm -rfv app_prototype/.DS_Store
rm -rfv app_prototype/log/*
rm -rfv app_prototype/tmp/*
rm -rfv app_prototype/coverage
|
fc361b827f4b77f56704177b41e4d6d2f96e3f2f | app/models/channel/email_send.rb | app/models/channel/email_send.rb |
require 'net/imap'
module Channel::EmailSend
def self.send(article, notification = false)
channel = Channel.find_by( area: 'Email::Outbound', active: true )
begin
# we need to require the channel backend individually otherwise we get a
# 'warning: toplevel constant Twitter referenced by Channel::Twitter' error e.g.
# so we have to convert the channel name to the filename via Rails String.underscore
# http://stem.ps/rails/2015/01/25/ruby-gotcha-toplevel-constant-referenced-by.html
require "channel/#{channel[:adapter].underscore}"
channel_object = Object.const_get("Channel::#{channel[:adapter]}")
channel_instance = channel_object.new
channel_instance.send(article, channel, notification)
channel_instance.disconnect
rescue => e
Rails.logger.error "Can't use Channel::#{channel[:adapter]}"
Rails.logger.error e.inspect
Rails.logger.error e.backtrace
end
end
end
|
require 'net/imap'
module Channel::EmailSend
def self.send(article, notification = false)
channel = Channel.find_by( area: 'Email::Outbound', active: true )
begin
# we need to require the channel backend individually otherwise we get a
# 'warning: toplevel constant Twitter referenced by Channel::Twitter' error e.g.
# so we have to convert the channel name to the filename via Rails String.underscore
# http://stem.ps/rails/2015/01/25/ruby-gotcha-toplevel-constant-referenced-by.html
require "channel/#{channel[:adapter].underscore}"
channel_object = Object.const_get("Channel::#{channel[:adapter]}")
channel_instance = channel_object.new
result = channel_instance.send(article, channel, notification)
channel_instance.disconnect
rescue => e
Rails.logger.error "Can't use Channel::#{channel[:adapter]}"
Rails.logger.error e.inspect
Rails.logger.error e.backtrace
end
result
end
end
| Return result. Needed for tests. | Return result. Needed for tests.
| Ruby | agpl-3.0 | zammad/zammad,zammad/zammad,monotek/zammad,monotek/zammad,monotek/zammad,monotek/zammad,zammad/zammad,zammad/zammad,monotek/zammad,zammad/zammad,zammad/zammad,monotek/zammad | ruby | ## Code Before:
require 'net/imap'
module Channel::EmailSend
def self.send(article, notification = false)
channel = Channel.find_by( area: 'Email::Outbound', active: true )
begin
# we need to require the channel backend individually otherwise we get a
# 'warning: toplevel constant Twitter referenced by Channel::Twitter' error e.g.
# so we have to convert the channel name to the filename via Rails String.underscore
# http://stem.ps/rails/2015/01/25/ruby-gotcha-toplevel-constant-referenced-by.html
require "channel/#{channel[:adapter].underscore}"
channel_object = Object.const_get("Channel::#{channel[:adapter]}")
channel_instance = channel_object.new
channel_instance.send(article, channel, notification)
channel_instance.disconnect
rescue => e
Rails.logger.error "Can't use Channel::#{channel[:adapter]}"
Rails.logger.error e.inspect
Rails.logger.error e.backtrace
end
end
end
## Instruction:
Return result. Needed for tests.
## Code After:
require 'net/imap'
module Channel::EmailSend
def self.send(article, notification = false)
channel = Channel.find_by( area: 'Email::Outbound', active: true )
begin
# we need to require the channel backend individually otherwise we get a
# 'warning: toplevel constant Twitter referenced by Channel::Twitter' error e.g.
# so we have to convert the channel name to the filename via Rails String.underscore
# http://stem.ps/rails/2015/01/25/ruby-gotcha-toplevel-constant-referenced-by.html
require "channel/#{channel[:adapter].underscore}"
channel_object = Object.const_get("Channel::#{channel[:adapter]}")
channel_instance = channel_object.new
result = channel_instance.send(article, channel, notification)
channel_instance.disconnect
rescue => e
Rails.logger.error "Can't use Channel::#{channel[:adapter]}"
Rails.logger.error e.inspect
Rails.logger.error e.backtrace
end
result
end
end
|
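
The one-line change above matters because of how the method's return value is produced: in the old version the last expression inside `begin` was `disconnect`, so the value of `send` was discarded (and the rescue path returned `nil`); assigning it to a local and ending the method with that local is what exposes it to tests. A stripped-down illustration of that capture-and-return shape — not taken from the project above, names are made up:

```ruby
# Illustrative sketch only -- not taken from the project above; names are made up.
def deliver_with_result
  result = nil
  begin
    result = "delivered"   # stands in for channel_instance.send(...)
    "disconnected"         # stands in for channel_instance.disconnect
  rescue => e
    warn e.message
  end
  result # explicit last expression, so callers and tests can assert on it
end

puts deliver_with_result   # => delivered
```
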
b394f79132d952be20baf15725715691ace69ced | web/slas-web/web/urls.py | web/slas-web/web/urls.py | from django.conf.urls import include, url
from django.contrib import admin
urlpatterns = [
url(r'^general/', include('general.urls', namespace='general')),
url(r'^apache/', include('apache.urls', namespace='apache')),
url(r'^bash/', include('bash.urls', namespace='bash')),
url(r'^admin/', include(admin.site.urls)),
# index
url(r'^$', 'general.views.status', name='index'),
url(r'^user/login/$', 'web.views.user_login'),
url(r'^user/auth$', 'web.views.user_auth'),
url(r'^user/logout/$', 'web.views.user_logout'),
url(r'^user/invalid_login/$', 'web.views.user_invalid_login'),
]
| from django.conf.urls import include, url
from django.contrib import admin
urlpatterns = [
url(r'^general/', include('general.urls', namespace='general')),
url(r'^apache/', include('apache.urls', namespace='apache')),
url(r'^bash/', include('bash.urls', namespace='bash')),
url(r'^admin/', include(admin.site.urls)),
# index
url(r'^$', 'general.views.status', name='index'),
url(r'^user/login/$', 'web.views.user_login'),
url(r'^user/auth$', 'web.views.user_auth'),
url(r'^user/logout/$', 'web.views.user_logout'),
url(r'^user/invalid_login/$', 'web.views.user_invalid_login'),
]
admin.site.site_header = 'SLAS web module administration tool'
| Change web admin page title | Change web admin page title
| Python | mit | chyla/slas,chyla/pat-lms,chyla/slas,chyla/pat-lms,chyla/slas,chyla/pat-lms,chyla/slas,chyla/slas,chyla/pat-lms,chyla/pat-lms,chyla/slas,chyla/pat-lms,chyla/slas,chyla/pat-lms | python | ## Code Before:
from django.conf.urls import include, url
from django.contrib import admin
urlpatterns = [
url(r'^general/', include('general.urls', namespace='general')),
url(r'^apache/', include('apache.urls', namespace='apache')),
url(r'^bash/', include('bash.urls', namespace='bash')),
url(r'^admin/', include(admin.site.urls)),
# index
url(r'^$', 'general.views.status', name='index'),
url(r'^user/login/$', 'web.views.user_login'),
url(r'^user/auth$', 'web.views.user_auth'),
url(r'^user/logout/$', 'web.views.user_logout'),
url(r'^user/invalid_login/$', 'web.views.user_invalid_login'),
]
## Instruction:
Change web admin page title
## Code After:
from django.conf.urls import include, url
from django.contrib import admin
urlpatterns = [
url(r'^general/', include('general.urls', namespace='general')),
url(r'^apache/', include('apache.urls', namespace='apache')),
url(r'^bash/', include('bash.urls', namespace='bash')),
url(r'^admin/', include(admin.site.urls)),
# index
url(r'^$', 'general.views.status', name='index'),
url(r'^user/login/$', 'web.views.user_login'),
url(r'^user/auth$', 'web.views.user_auth'),
url(r'^user/logout/$', 'web.views.user_logout'),
url(r'^user/invalid_login/$', 'web.views.user_invalid_login'),
]
admin.site.site_header = 'SLAS web module administration tool'
|
1eaae78c14b26378a606221eb61f97ec15134baa | src/gpl/test/simple01-td.py | src/gpl/test/simple01-td.py | from openroad import Design, Tech
import helpers
import gpl_aux
tech = Tech()
tech.readLiberty("./library/nangate45/NangateOpenCellLibrary_typical.lib")
tech.readLef("./nangate45.lef")
design = Design(tech)
design.readDef("./simple01-td.def")
design.evalTclString("create_clock -name core_clock -period 2 clk")
design.evalTclString("set_wire_rc -signal -layer metal3")
design.evalTclString("set_wire_rc -clock -layer metal5")
gpl_aux.global_placement(design, timing_driven=True)
design.evalTclString("estimate_parasitics -placement")
design.evalTclString("report_worst_slack")
def_file = helpers.make_result_file("simple01-td.def")
design.writeDef(def_file)
helpers.diff_files(def_file, "simple01-td.defok")
# source helpers.tcl
# set test_name simple01-td
# read_liberty ./library/nangate45/NangateOpenCellLibrary_typical.lib
# read_lef ./nangate45.lef
# read_def ./$test_name.def
# create_clock -name core_clock -period 2 clk
# set_wire_rc -signal -layer metal3
# set_wire_rc -clock -layer metal5
# global_placement -timing_driven
# # check reported wns
# estimate_parasitics -placement
# report_worst_slack
# set def_file [make_result_file $test_name.def]
# write_def $def_file
# diff_file $def_file $test_name.defok
| from openroad import Design, Tech
import helpers
import gpl_aux
tech = Tech()
tech.readLiberty("./library/nangate45/NangateOpenCellLibrary_typical.lib")
tech.readLef("./nangate45.lef")
design = Design(tech)
design.readDef("./simple01-td.def")
design.evalTclString("create_clock -name core_clock -period 2 clk")
design.evalTclString("set_wire_rc -signal -layer metal3")
design.evalTclString("set_wire_rc -clock -layer metal5")
gpl_aux.global_placement(design, timing_driven=True)
design.evalTclString("estimate_parasitics -placement")
design.evalTclString("report_worst_slack")
def_file = helpers.make_result_file("simple01-td.def")
design.writeDef(def_file)
helpers.diff_files(def_file, "simple01-td.defok")
| Remove dead code from test | Remove dead code from test
Signed-off-by: Don MacMillen <1f1e67e5fdb25d2e5cd18ddc0fee425272daab56@macmillen.net>
| Python | bsd-3-clause | The-OpenROAD-Project/OpenROAD,The-OpenROAD-Project/OpenROAD,The-OpenROAD-Project/OpenROAD,The-OpenROAD-Project/OpenROAD,QuantamHD/OpenROAD,The-OpenROAD-Project/OpenROAD,QuantamHD/OpenROAD,QuantamHD/OpenROAD,QuantamHD/OpenROAD,QuantamHD/OpenROAD | python | ## Code Before:
from openroad import Design, Tech
import helpers
import gpl_aux
tech = Tech()
tech.readLiberty("./library/nangate45/NangateOpenCellLibrary_typical.lib")
tech.readLef("./nangate45.lef")
design = Design(tech)
design.readDef("./simple01-td.def")
design.evalTclString("create_clock -name core_clock -period 2 clk")
design.evalTclString("set_wire_rc -signal -layer metal3")
design.evalTclString("set_wire_rc -clock -layer metal5")
gpl_aux.global_placement(design, timing_driven=True)
design.evalTclString("estimate_parasitics -placement")
design.evalTclString("report_worst_slack")
def_file = helpers.make_result_file("simple01-td.def")
design.writeDef(def_file)
helpers.diff_files(def_file, "simple01-td.defok")
# source helpers.tcl
# set test_name simple01-td
# read_liberty ./library/nangate45/NangateOpenCellLibrary_typical.lib
# read_lef ./nangate45.lef
# read_def ./$test_name.def
# create_clock -name core_clock -period 2 clk
# set_wire_rc -signal -layer metal3
# set_wire_rc -clock -layer metal5
# global_placement -timing_driven
# # check reported wns
# estimate_parasitics -placement
# report_worst_slack
# set def_file [make_result_file $test_name.def]
# write_def $def_file
# diff_file $def_file $test_name.defok
## Instruction:
Remove dead code from test
Signed-off-by: Don MacMillen <1f1e67e5fdb25d2e5cd18ddc0fee425272daab56@macmillen.net>
## Code After:
from openroad import Design, Tech
import helpers
import gpl_aux
tech = Tech()
tech.readLiberty("./library/nangate45/NangateOpenCellLibrary_typical.lib")
tech.readLef("./nangate45.lef")
design = Design(tech)
design.readDef("./simple01-td.def")
design.evalTclString("create_clock -name core_clock -period 2 clk")
design.evalTclString("set_wire_rc -signal -layer metal3")
design.evalTclString("set_wire_rc -clock -layer metal5")
gpl_aux.global_placement(design, timing_driven=True)
design.evalTclString("estimate_parasitics -placement")
design.evalTclString("report_worst_slack")
def_file = helpers.make_result_file("simple01-td.def")
design.writeDef(def_file)
helpers.diff_files(def_file, "simple01-td.defok")
|
1b6bd4316fe8cfafcf9af5460cfcbc58a4a689c2 | ruby/tracery.gemspec | ruby/tracery.gemspec |
file_list = Dir["lib/*.rb"]
test_file_list = Dir["test/*.rb"]
Gem::Specification.new do |s|
s.name = "tracery"
s.version = "0.7.2"
s.date = "2016-02-27"
s.summary = "A text expansion library"
s.description = <<EOF
Tracery is a library for text generation.
The text is expanded by traversing a grammar.
See the main github repo for examples and documentation.
EOF
s.authors = ["Kate Compton", "Eli Brody"]
s.email = "brodyeli@gmail.com"
s.files = file_list
s.test_files = test_file_list
s.homepage = "https://github.com/elib/tracery"
s.license = "Apache-2.0"
end |
file_list = Dir["lib/*.rb"]
test_file_list = Dir["test/*.rb"]
Gem::Specification.new do |s|
s.name = "tracery"
s.version = "0.7.3"
s.date = Date.today.to_s
s.summary = "A text expansion library"
s.description = <<EOF
Tracery is a library for text generation.
The text is expanded by traversing a grammar.
See the main github repo for examples and documentation.
EOF
s.authors = ["Kate Compton", "Eli Brody"]
s.email = "brodyeli@gmail.com"
s.files = file_list
s.test_files = test_file_list
s.homepage = "https://github.com/elib/tracery"
s.license = "Apache-2.0"
end | Fix gem date for future publishing. | Fix gem date for future publishing.
| Ruby | apache-2.0 | elib/tracery | ruby | ## Code Before:
file_list = Dir["lib/*.rb"]
test_file_list = Dir["test/*.rb"]
Gem::Specification.new do |s|
s.name = "tracery"
s.version = "0.7.2"
s.date = "2016-02-27"
s.summary = "A text expansion library"
s.description = <<EOF
Tracery is a library for text generation.
The text is expanded by traversing a grammar.
See the main github repo for examples and documentation.
EOF
s.authors = ["Kate Compton", "Eli Brody"]
s.email = "brodyeli@gmail.com"
s.files = file_list
s.test_files = test_file_list
s.homepage = "https://github.com/elib/tracery"
s.license = "Apache-2.0"
end
## Instruction:
Fix gem date for future publishing.
## Code After:
file_list = Dir["lib/*.rb"]
test_file_list = Dir["test/*.rb"]
Gem::Specification.new do |s|
s.name = "tracery"
s.version = "0.7.3"
s.date = Date.today.to_s
s.summary = "A text expansion library"
s.description = <<EOF
Tracery is a library for text generation.
The text is expanded by traversing a grammar.
See the main github repo for examples and documentation.
EOF
s.authors = ["Kate Compton", "Eli Brody"]
s.email = "brodyeli@gmail.com"
s.files = file_list
s.test_files = test_file_list
s.homepage = "https://github.com/elib/tracery"
s.license = "Apache-2.0"
end |
b71d8da933ef1a1fe4893398666ace720bcd8131 | packages/wr/wrapped.yaml | packages/wr/wrapped.yaml | homepage: https://github.com/google/hs-wrapped#readme
changelog-type: markdown
hash: 2852522f3c67c36aff10bbda7651b5410a642502d535b7e22f0e76bdc004d461
test-bench-deps: {}
maintainer: Andrew Pritchard <awpr@google.com>
synopsis: Newtypes to carry DerivingVia instances
changelog: |
# 0.1.0.0
Initial version.
# 0.1.0.1
* Extend support back to GHC 8.0.
basic-deps:
base: '>=4.9 && <4.16'
all-versions:
- 0.1.0.0
- 0.1.0.1
author: Andrew Pritchard <awpr@google.com>
latest: 0.1.0.1
description-type: haddock
description: |-
This exports the newtypes 'Wrapped' and 'Wrapped1', which are meant to hold
typeclass implementations derived from other classes (most frequently,
'Generic' and 'Generic1'). If you implement a general instance of a class,
add it as an instance for 'Wrapped' or 'Wrapped1', and users will be able to
derive it uniformly as @deriving TheClass via Wrapped TheType@.
license-name: Apache-2.0
| homepage: https://github.com/google/hs-wrapped#readme
changelog-type: markdown
hash: 7896256898d424e9e72bf4fc235701e64f76faa8ed9393c93e975338ddcd54ad
test-bench-deps: {}
maintainer: Andrew Pritchard <awpr@google.com>
synopsis: Newtypes to carry DerivingVia instances
changelog: |
# 0.1.0.0
Initial version.
# 0.1.0.1
* Extend support back to GHC 8.0.
basic-deps:
base: '>=4.9 && <4.17'
all-versions:
- 0.1.0.0
- 0.1.0.1
author: Andrew Pritchard <awpr@google.com>
latest: 0.1.0.1
description-type: haddock
description: |-
This exports the newtypes 'Wrapped' and 'Wrapped1', which are meant to hold
typeclass implementations derived from other classes (most frequently,
'Generic' and 'Generic1'). If you implement a general instance of a class,
add it as an instance for 'Wrapped' or 'Wrapped1', and users will be able to
derive it uniformly as @deriving TheClass via Wrapped TheType@.
license-name: Apache-2.0
| Update from Hackage at 2021-11-03T06:39:57Z | Update from Hackage at 2021-11-03T06:39:57Z
| YAML | mit | commercialhaskell/all-cabal-metadata | yaml | ## Code Before:
homepage: https://github.com/google/hs-wrapped#readme
changelog-type: markdown
hash: 2852522f3c67c36aff10bbda7651b5410a642502d535b7e22f0e76bdc004d461
test-bench-deps: {}
maintainer: Andrew Pritchard <awpr@google.com>
synopsis: Newtypes to carry DerivingVia instances
changelog: |
# 0.1.0.0
Initial version.
# 0.1.0.1
* Extend support back to GHC 8.0.
basic-deps:
base: '>=4.9 && <4.16'
all-versions:
- 0.1.0.0
- 0.1.0.1
author: Andrew Pritchard <awpr@google.com>
latest: 0.1.0.1
description-type: haddock
description: |-
This exports the newtypes 'Wrapped' and 'Wrapped1', which are meant to hold
typeclass implementations derived from other classes (most frequently,
'Generic' and 'Generic1'). If you implement a general instance of a class,
add it as an instance for 'Wrapped' or 'Wrapped1', and users will be able to
derive it uniformly as @deriving TheClass via Wrapped TheType@.
license-name: Apache-2.0
## Instruction:
Update from Hackage at 2021-11-03T06:39:57Z
## Code After:
homepage: https://github.com/google/hs-wrapped#readme
changelog-type: markdown
hash: 7896256898d424e9e72bf4fc235701e64f76faa8ed9393c93e975338ddcd54ad
test-bench-deps: {}
maintainer: Andrew Pritchard <awpr@google.com>
synopsis: Newtypes to carry DerivingVia instances
changelog: |
# 0.1.0.0
Initial version.
# 0.1.0.1
* Extend support back to GHC 8.0.
basic-deps:
base: '>=4.9 && <4.17'
all-versions:
- 0.1.0.0
- 0.1.0.1
author: Andrew Pritchard <awpr@google.com>
latest: 0.1.0.1
description-type: haddock
description: |-
This exports the newtypes 'Wrapped' and 'Wrapped1', which are meant to hold
typeclass implementations derived from other classes (most frequently,
'Generic' and 'Generic1'). If you implement a general instance of a class,
add it as an instance for 'Wrapped' or 'Wrapped1', and users will be able to
derive it uniformly as @deriving TheClass via Wrapped TheType@.
license-name: Apache-2.0
|
194044350a9e84b52dc911a82336b7e31b11c654 | tox.ini | tox.ini | [tox]
envlist = py{27,32,33}-django{16,17,18},psql,pep8,coverage
[testenv]
deps =
pytz
PyYAML
djangorestframework>=3.0
django16: Django>=1.6,<1.7
django17: Django>=1.7,<1.8
django18: https://www.djangoproject.com/download/1.8b1/tarball/
-rpip-test-requirements.txt
commands = py.test tests/
[testenv:psql]
deps =
psycopg2
-rpip-requirements.txt
-rpip-test-requirements.txt
passenv = PGHOST PGDATABASE PGUSER PGPASSWORD
setenv =
USE_POSTGRESQL_DATABASE=yes
commands = py.test tests/
[testenv:pep8]
deps = flake8
commands = flake8 caspy
[testenv:coverage]
deps =
-rpip-requirements.txt
-rpip-test-requirements.txt
commands = py.test --cov caspy/ tests/
[flake8]
ignore = E121,E126
| [tox]
envlist = py{27,33}-django{16,17},py{27,34,35}-django{18,19},psql,pep8,coverage
[testenv]
deps =
pytz
PyYAML
django16: Django>=1.6,<1.7
django16: djangorestframework>=3.0,<3.1
django17: Django>=1.7,<1.8
django17: djangorestframework>=3.1,<3.2
django18: Django>=1.8,<1.9
django18: djangorestframework>=3.2,<3.3
django19: Django==1.9b1
django19: djangorestframework>=3.3
-rpip-test-requirements.txt
commands = py.test tests/
[testenv:psql]
deps =
psycopg2
-rpip-requirements.txt
-rpip-test-requirements.txt
passenv = PGHOST PGDATABASE PGUSER PGPASSWORD
setenv =
USE_POSTGRESQL_DATABASE=yes
commands = py.test tests/
[testenv:pep8]
deps = flake8
commands = flake8 caspy
[testenv:coverage]
deps =
-rpip-requirements.txt
-rpip-test-requirements.txt
commands = py.test --cov caspy/ tests/
[flake8]
ignore = E121,E126
| Test only compatible version combinations | Test only compatible version combinations
| INI | bsd-3-clause | altaurog/django-caspy,altaurog/django-caspy,altaurog/django-caspy | ini | ## Code Before:
[tox]
envlist = py{27,32,33}-django{16,17,18},psql,pep8,coverage
[testenv]
deps =
pytz
PyYAML
djangorestframework>=3.0
django16: Django>=1.6,<1.7
django17: Django>=1.7,<1.8
django18: https://www.djangoproject.com/download/1.8b1/tarball/
-rpip-test-requirements.txt
commands = py.test tests/
[testenv:psql]
deps =
psycopg2
-rpip-requirements.txt
-rpip-test-requirements.txt
passenv = PGHOST PGDATABASE PGUSER PGPASSWORD
setenv =
USE_POSTGRESQL_DATABASE=yes
commands = py.test tests/
[testenv:pep8]
deps = flake8
commands = flake8 caspy
[testenv:coverage]
deps =
-rpip-requirements.txt
-rpip-test-requirements.txt
commands = py.test --cov caspy/ tests/
[flake8]
ignore = E121,E126
## Instruction:
Test only compatible version combinations
## Code After:
[tox]
envlist = py{27,33}-django{16,17},py{27,34,35}-django{18,19},psql,pep8,coverage
[testenv]
deps =
pytz
PyYAML
django16: Django>=1.6,<1.7
django16: djangorestframework>=3.0,<3.1
django17: Django>=1.7,<1.8
django17: djangorestframework>=3.1,<3.2
django18: Django>=1.8,<1.9
django18: djangorestframework>=3.2,<3.3
django19: Django==1.9b1
django19: djangorestframework>=3.3
-rpip-test-requirements.txt
commands = py.test tests/
[testenv:psql]
deps =
psycopg2
-rpip-requirements.txt
-rpip-test-requirements.txt
passenv = PGHOST PGDATABASE PGUSER PGPASSWORD
setenv =
USE_POSTGRESQL_DATABASE=yes
commands = py.test tests/
[testenv:pep8]
deps = flake8
commands = flake8 caspy
[testenv:coverage]
deps =
-rpip-requirements.txt
-rpip-test-requirements.txt
commands = py.test --cov caspy/ tests/
[flake8]
ignore = E121,E126
|
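
With the matrix above split into compatible pairs, individual combinations can still be exercised locally by name: `tox -l` lists the environments generated from `envlist`, and `-e` runs a single one. The two commands below are an illustrative sketch, not part of the project above.

```sh
# Illustrative sketch only -- not part of the project above.
tox -l                   # list the environments generated from envlist
tox -e py27-django16     # run one python/Django combination from the matrix
```
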
eba6d5a9b54fe4765588bae50083859014ef3b49 | .travis.yml | .travis.yml | language: python
sudo: false
env:
- CONDA="python=2.7"
- CONDA="python=3.4"
- CONDA="python=3.5"
before_install:
- URL=http://repo.continuum.io/miniconda/Miniconda-latest-Linux-x86_64.sh
- wget $URL -O miniconda.sh
- bash miniconda.sh -b -p $HOME/miniconda
- export PATH="$HOME/miniconda/bin:$PATH"
- conda update --yes --all
- conda config --add channels conda-forge --force
- travis_retry conda create --yes -n test --file requirements.txt $CONDA
- travis_retry conda install -n test --yes pytest flake8
- source activate test
install:
- pip install -r requirements.txt
- pip install -e .
script:
- py.test -s -rxs -v
- flake8 --ignore=E501,E251,E221,E201,E202,E203 -qq --statistics . || true
| language: python
sudo: false
env:
- CONDA="python=2.7"
- CONDA="python=3.4"
- CONDA="python=3.5"
before_install:
- URL=http://repo.continuum.io/miniconda/Miniconda-latest-Linux-x86_64.sh
- wget $URL -O miniconda.sh
- bash miniconda.sh -b -p $HOME/miniconda
- export PATH="$HOME/miniconda/bin:$PATH"
- conda update --yes --all
- conda config --add channels conda-forge --force
- travis_retry conda create --yes -n test --file requirements.txt $CONDA
- travis_retry conda install -n test --yes pytest flake8
- source activate test
install:
- pip install -r requirements.txt
- pip install -r test_requirements.txt
- pip install -e .
script:
- py.test -s -rxs -v
- flake8 --ignore=E501,E251,E221,E201,E202,E203 -qq --statistics . || true
| Add test_requirements to Travis build | Add test_requirements to Travis build
| YAML | apache-2.0 | petejan/compliance-checker,ocefpaf/compliance-checker,lukecampbell/compliance-checker,ioos/compliance-checker,aodn/compliance-checker,DanielJMaher/compliance-checker | yaml | ## Code Before:
language: python
sudo: false
env:
- CONDA="python=2.7"
- CONDA="python=3.4"
- CONDA="python=3.5"
before_install:
- URL=http://repo.continuum.io/miniconda/Miniconda-latest-Linux-x86_64.sh
- wget $URL -O miniconda.sh
- bash miniconda.sh -b -p $HOME/miniconda
- export PATH="$HOME/miniconda/bin:$PATH"
- conda update --yes --all
- conda config --add channels conda-forge --force
- travis_retry conda create --yes -n test --file requirements.txt $CONDA
- travis_retry conda install -n test --yes pytest flake8
- source activate test
install:
- pip install -r requirements.txt
- pip install -e .
script:
- py.test -s -rxs -v
- flake8 --ignore=E501,E251,E221,E201,E202,E203 -qq --statistics . || true
## Instruction:
Add test_requirements to Travis build
## Code After:
language: python
sudo: false
env:
- CONDA="python=2.7"
- CONDA="python=3.4"
- CONDA="python=3.5"
before_install:
- URL=http://repo.continuum.io/miniconda/Miniconda-latest-Linux-x86_64.sh
- wget $URL -O miniconda.sh
- bash miniconda.sh -b -p $HOME/miniconda
- export PATH="$HOME/miniconda/bin:$PATH"
- conda update --yes --all
- conda config --add channels conda-forge --force
- travis_retry conda create --yes -n test --file requirements.txt $CONDA
- travis_retry conda install -n test --yes pytest flake8
- source activate test
install:
- pip install -r requirements.txt
- pip install -r test_requirements.txt
- pip install -e .
script:
- py.test -s -rxs -v
- flake8 --ignore=E501,E251,E221,E201,E202,E203 -qq --statistics . || true
|
4940f2d9e0e016bf6310a6c85489740adbdab3c6 | _includes/case_studies/list.html | _includes/case_studies/list.html | <!-- Case studies Section -->
<div class="casestudy-list">
{% assign case_studies = site.case_studies | sort: 'id' %}
{% for post in case_studies reversed %}
{% if post.status contains 'published' %}
<article id="{{post.slug}}" class="casestudy-item">
<div class="intro">
<span>
<div class="text">
<h4 class="title">{{ post.title }}</h4>
<p class="client">{{ post.client }}</p>
<ul class="taglist">
<li class="tag">Tags:</li>
{% for tag in post.tags %}
<li class="tag">
<a>{{ tag }}</a>
</li>
{% endfor %}
</ul>
</div>
<div class="productimg">
<img src="/assets/img/case-studies/product/{{ post.slug }}.png" alt="" class="img">
</div>
</span>
<ul class="service-list">
<li class="serviceitem -title">Related services:</li>
{% for service in post.related-services %}
<li class="serviceitem">
<a class="link-pink capitalize" href="/services/{{ service }}">{{ service | replace: "-", " " }}</a>
</li>
{% endfor %}
</ul>
</div>
<div class="content">
{{ post.content | markdownify }}
</div>
</article>
{% endif %}
{% endfor %}
</div>
| <!-- Case studies Section -->
<div id="projects" class="casestudy-list">
{% assign case_studies = site.case_studies | sort: 'id' %}
{% for post in case_studies reversed %}
{% if post.status contains 'published' %}
<article id="{{post.slug}}" class="casestudy-item">
<div class="intro">
<span>
<div class="text">
<h4 class="title">{{ post.title }}</h4>
<p class="client">{{ post.client }}</p>
<ul class="taglist">
<li class="tag">Tags:</li>
{% for tag in post.tags %}
<li class="tag">
<a>{{ tag }}</a>
</li>
{% endfor %}
</ul>
</div>
<div class="productimg">
<img src="/assets/img/case-studies/product/{{ post.slug }}.png" alt="" class="img">
</div>
</span>
<ul class="service-list">
<li class="serviceitem -title">Related services:</li>
{% for service in post.related-services %}
<li class="serviceitem">
<a class="link-pink capitalize" href="/services/{{ service }}">{{ service | replace: "-", " " }}</a>
</li>
{% endfor %}
</ul>
</div>
<div class="content">
{{ post.content | markdownify }}
</div>
</article>
{% endif %}
{% endfor %}
</div>
| Make "see our work" button work again | Make "see our work" button work again
| HTML | apache-2.0 | RapidRiverSoftware/rapidriversoftware.github.io,RapidRiverSoftware/rapidriversoftware.github.io,RapidRiverSoftware/rapidriversoftware.github.io,RapidRiverSoftware/rapidriversoftware.github.io | html | ## Code Before:
<!-- Case studies Section -->
<div class="casestudy-list">
{% assign case_studies = site.case_studies | sort: 'id' %}
{% for post in case_studies reversed %}
{% if post.status contains 'published' %}
<article id="{{post.slug}}" class="casestudy-item">
<div class="intro">
<span>
<div class="text">
<h4 class="title">{{ post.title }}</h4>
<p class="client">{{ post.client }}</p>
<ul class="taglist">
<li class="tag">Tags:</li>
{% for tag in post.tags %}
<li class="tag">
<a>{{ tag }}</a>
</li>
{% endfor %}
</ul>
</div>
<div class="productimg">
<img src="/assets/img/case-studies/product/{{ post.slug }}.png" alt="" class="img">
</div>
</span>
<ul class="service-list">
<li class="serviceitem -title">Related services:</li>
{% for service in post.related-services %}
<li class="serviceitem">
<a class="link-pink capitalize" href="/services/{{ service }}">{{ service | replace: "-", " " }}</a>
</li>
{% endfor %}
</ul>
</div>
<div class="content">
{{ post.content | markdownify }}
</div>
</article>
{% endif %}
{% endfor %}
</div>
## Instruction:
Make "see our work" button work again
## Code After:
<!-- Case studies Section -->
<div id="projects" class="casestudy-list">
{% assign case_studies = site.case_studies | sort: 'id' %}
{% for post in case_studies reversed %}
{% if post.status contains 'published' %}
<article id="{{post.slug}}" class="casestudy-item">
<div class="intro">
<span>
<div class="text">
<h4 class="title">{{ post.title }}</h4>
<p class="client">{{ post.client }}</p>
<ul class="taglist">
<li class="tag">Tags:</li>
{% for tag in post.tags %}
<li class="tag">
<a>{{ tag }}</a>
</li>
{% endfor %}
</ul>
</div>
<div class="productimg">
<img src="/assets/img/case-studies/product/{{ post.slug }}.png" alt="" class="img">
</div>
</span>
<ul class="service-list">
<li class="serviceitem -title">Related services:</li>
{% for service in post.related-services %}
<li class="serviceitem">
<a class="link-pink capitalize" href="/services/{{ service }}">{{ service | replace: "-", " " }}</a>
</li>
{% endfor %}
</ul>
</div>
<div class="content">
{{ post.content | markdownify }}
</div>
</article>
{% endif %}
{% endfor %}
</div>
|
357c9129f289c0049b54f25dc5cb9ce49cdd19f8 | app/controllers/users/questions_controller.rb | app/controllers/users/questions_controller.rb | class Users::QuestionsController < ApplicationController
inherit_resources
belongs_to :user
def new
@project = Project.find(params[:project_id])
unless current_user
session[:return_to] = project_path(@project, anchor: 'open-new-user-question-modal')
return redirect_to new_user_session_path
end
@user = parent
render layout: false
end
def create
project = Project.find(params[:project_id])
unless current_user
session[:return_to] = project_path(project, anchor: 'open-new-user-question-modal')
return redirect_to new_user_session_path
end
Users::QuestionsMailer.new(params[:question][:body], parent, project, current_user).deliver
flash.notice = "#{parent.display_name} received your question and will be in touch shortly."
redirect_to project_path(project)
end
end
| class Users::QuestionsController < ApplicationController
def new
@project = Project.find(params[:project_id])
unless current_user
session[:return_to] = project_path(@project, anchor: 'open-new-user-question-modal')
return redirect_to new_user_session_path
end
@user = parent
render layout: false
end
def create
project = Project.find(params[:project_id])
unless current_user
session[:return_to] = project_path(project, anchor: 'open-new-user-question-modal')
return redirect_to new_user_session_path
end
Users::QuestionsMailer.new(params[:question][:body], parent, project, current_user).deliver
flash.notice = "#{parent.display_name} received your question and will be in touch shortly."
redirect_to project_path(project)
end
private
def parent
@user ||= User.find(params[:user_id])
end
end
| Remove inherit_resources from user questions controller | Remove inherit_resources from user questions controller
| Ruby | mit | gustavoguichard/neighborly,gustavoguichard/neighborly,gustavoguichard/neighborly | ruby | ## Code Before:
class Users::QuestionsController < ApplicationController
inherit_resources
belongs_to :user
def new
@project = Project.find(params[:project_id])
unless current_user
session[:return_to] = project_path(@project, anchor: 'open-new-user-question-modal')
return redirect_to new_user_session_path
end
@user = parent
render layout: false
end
def create
project = Project.find(params[:project_id])
unless current_user
session[:return_to] = project_path(project, anchor: 'open-new-user-question-modal')
return redirect_to new_user_session_path
end
Users::QuestionsMailer.new(params[:question][:body], parent, project, current_user).deliver
flash.notice = "#{parent.display_name} received your question and will be in touch shortly."
redirect_to project_path(project)
end
end
## Instruction:
Remove inherit_resources from user questions controller
## Code After:
class Users::QuestionsController < ApplicationController
def new
@project = Project.find(params[:project_id])
unless current_user
session[:return_to] = project_path(@project, anchor: 'open-new-user-question-modal')
return redirect_to new_user_session_path
end
@user = parent
render layout: false
end
def create
project = Project.find(params[:project_id])
unless current_user
session[:return_to] = project_path(project, anchor: 'open-new-user-question-modal')
return redirect_to new_user_session_path
end
Users::QuestionsMailer.new(params[:question][:body], parent, project, current_user).deliver
flash.notice = "#{parent.display_name} received your question and will be in touch shortly."
redirect_to project_path(project)
end
private
def parent
@user ||= User.find(params[:user_id])
end
end
|
3cc47a95bc58391ce6e95e8ead36313180b0c947 | src/components/AppReveal/AppReveal.vue | src/components/AppReveal/AppReveal.vue | <template>
<component
:is="tag"
ref="element">
<slot />
</component>
</template>
<script>
import anime from 'animejs'
export default {
name: 'AppReveal',
className: 'AppReveal',
props: {
active: {
type: Boolean,
default: false
},
tag: {
type: String,
default: 'div'
},
delay: {
type: Number,
default: 0
}
},
computed: {
target() {
return this.$refs.element
}
},
watch: {
active(val) {
if (val) {
this.animationPlay()
} else {
this.animationReverse()
}
}
},
mounted() {
this.animationCreate()
},
methods: {
animationCreate() {
this.animation = anime.timeline({
loop: false,
autoplay: false
})
.add({
targets: this.target,
skew: ['-10deg', '0deg'],
translateX: [10, 0],
translateZ: 0,
opacity: [0, 1],
easing: 'easeInOutQuad',
duration: 500,
delay: 250 + this.delay
})
},
animationPlay() {
this.animation.play()
},
animationReverse() {
this.animation.reverse()
}
}
}
</script>
<style lang="scss" scoped>
.AppReveal {
opacity: 0;
}
</style>
| <template>
<component
:is="tag"
ref="element"
:class="$options.className">
<slot />
</component>
</template>
<script>
import anime from 'animejs'
export default {
name: 'AppReveal',
className: 'AppReveal',
props: {
active: {
type: Boolean,
default: false
},
tag: {
type: String,
default: 'div'
},
delay: {
type: Number,
default: 0
}
},
computed: {
target() {
return this.$refs.element
}
},
watch: {
active(val) {
if (val) {
this.animationPlay()
} else {
this.animationReverse()
}
}
},
mounted() {
this.animationCreate()
},
methods: {
animationCreate() {
this.animation = anime.timeline({
loop: false,
autoplay: false
})
.add({
targets: this.target,
skew: ['-10deg', '0deg'],
translateX: [10, 0],
translateZ: 0,
opacity: [0, 1],
easing: 'easeInOutQuad',
duration: 500,
delay: 250 + this.delay
})
},
animationPlay() {
this.animation.play()
},
animationReverse() {
this.animation.reverse()
}
}
}
</script>
<style lang="scss" scoped>
.AppReveal {
opacity: 0;
}
</style>
| Add class to attach the styles to. | Add class to attach the styles to.
| Vue | agpl-3.0 | michaelpumo/michaelpumo.github.io,michaelpumo/michaelpumo.github.io | vue | ## Code Before:
<template>
<component
:is="tag"
ref="element">
<slot />
</component>
</template>
<script>
import anime from 'animejs'
export default {
name: 'AppReveal',
className: 'AppReveal',
props: {
active: {
type: Boolean,
default: false
},
tag: {
type: String,
default: 'div'
},
delay: {
type: Number,
default: 0
}
},
computed: {
target() {
return this.$refs.element
}
},
watch: {
active(val) {
if (val) {
this.animationPlay()
} else {
this.animationReverse()
}
}
},
mounted() {
this.animationCreate()
},
methods: {
animationCreate() {
this.animation = anime.timeline({
loop: false,
autoplay: false
})
.add({
targets: this.target,
skew: ['-10deg', '0deg'],
translateX: [10, 0],
translateZ: 0,
opacity: [0, 1],
easing: 'easeInOutQuad',
duration: 500,
delay: 250 + this.delay
})
},
animationPlay() {
this.animation.play()
},
animationReverse() {
this.animation.reverse()
}
}
}
</script>
<style lang="scss" scoped>
.AppReveal {
opacity: 0;
}
</style>
## Instruction:
Add class to attach the styles to.
## Code After:
<template>
<component
:is="tag"
ref="element"
:class="$options.className">
<slot />
</component>
</template>
<script>
import anime from 'animejs'
export default {
name: 'AppReveal',
className: 'AppReveal',
props: {
active: {
type: Boolean,
default: false
},
tag: {
type: String,
default: 'div'
},
delay: {
type: Number,
default: 0
}
},
computed: {
target() {
return this.$refs.element
}
},
watch: {
active(val) {
if (val) {
this.animationPlay()
} else {
this.animationReverse()
}
}
},
mounted() {
this.animationCreate()
},
methods: {
animationCreate() {
this.animation = anime.timeline({
loop: false,
autoplay: false
})
.add({
targets: this.target,
skew: ['-10deg', '0deg'],
translateX: [10, 0],
translateZ: 0,
opacity: [0, 1],
easing: 'easeInOutQuad',
duration: 500,
delay: 250 + this.delay
})
},
animationPlay() {
this.animation.play()
},
animationReverse() {
this.animation.reverse()
}
}
}
</script>
<style lang="scss" scoped>
.AppReveal {
opacity: 0;
}
</style>
|
3edc20d576b57a0e30ed53d61c3ff960cb891853 | web/concrete/themes/dashboard/dashboard_primary_five.php | web/concrete/themes/dashboard/dashboard_primary_five.php | <?
defined('C5_EXECUTE') or die("Access Denied.");
$this->inc('elements/header.php', array('enableEditing' => true));
?>
<div class="ccm-ui">
<div class="newsflow" id="newsflow-main">
<? $this->inc('elements/header_newsflow.php'); ?>
<table class="newsflow-layout">
<tr>
<td class="newsflow-em1" style="width: 66%" colspan="2" rowspan="2">
<div id="ccm-dashboard-welcome-back">
<? $a = new Area('Primary'); $a->display($c); ?>
</div>
</td>
<td><? $a = new Area('Secondary 1'); $a->display($c); ?></td>
</tr>
<tr>
<td style="width: 34%"><? $a = new Area('Secondary 2'); $a->display($c); ?></td>
</tr>
<tr>
<td style="width: 33%"><? $a = new Area('Secondary 3'); $a->display($c); ?></td>
<td style="width: 33%"><? $a = new Area('Secondary 4'); $a->display($c); ?></td>
<td style="width: 34%"><? $a = new Area('Secondary 5'); $a->display($c); ?></td>
</tr>
</table>
</div>
</div>
<? $this->inc('elements/footer.php'); ?> | <?
defined('C5_EXECUTE') or die("Access Denied.");
$this->inc('elements/header.php', array('enableEditing' => true));
?>
<div class="ccm-ui">
<div class="newsflow" id="newsflow-main">
<? $this->inc('elements/header_newsflow.php'); ?>
<table class="newsflow-layout">
<tr>
<td class="newsflow-em1" style="width: 66%" rowspan="3">
<div id="ccm-dashboard-welcome-back">
<? $a = new Area('Primary'); $a->display($c); ?>
</div>
</td>
<td><? $a = new Area('Secondary 1'); $a->display($c); ?></td>
</tr>
<tr>
<td style="width: 34%"><? $a = new Area('Secondary 2'); $a->display($c); ?></td>
</tr>
<tr>
<td style="width: 34%"><? $a = new Area('Secondary 5'); $a->display($c); ?></td>
</tr>
</table>
</div>
</div>
<? $this->inc('elements/footer.php'); ?> | Fix broken newsflow on launch | Fix broken newsflow on launch
| PHP | mit | avdevs/concrete5,avdevs/concrete5,avdevs/concrete5,TimDix/concrete5,MichaelMaar/concrete5,MichaelMaar/concrete5,matt9mg/concrete5,TimDix/concrete5,matt9mg/concrete5,MichaelMaar/concrete5,TimDix/concrete5,matt9mg/concrete5 | php | ## Code Before:
<?
defined('C5_EXECUTE') or die("Access Denied.");
$this->inc('elements/header.php', array('enableEditing' => true));
?>
<div class="ccm-ui">
<div class="newsflow" id="newsflow-main">
<? $this->inc('elements/header_newsflow.php'); ?>
<table class="newsflow-layout">
<tr>
<td class="newsflow-em1" style="width: 66%" colspan="2" rowspan="2">
<div id="ccm-dashboard-welcome-back">
<? $a = new Area('Primary'); $a->display($c); ?>
</div>
</td>
<td><? $a = new Area('Secondary 1'); $a->display($c); ?></td>
</tr>
<tr>
<td style="width: 34%"><? $a = new Area('Secondary 2'); $a->display($c); ?></td>
</tr>
<tr>
<td style="width: 33%"><? $a = new Area('Secondary 3'); $a->display($c); ?></td>
<td style="width: 33%"><? $a = new Area('Secondary 4'); $a->display($c); ?></td>
<td style="width: 34%"><? $a = new Area('Secondary 5'); $a->display($c); ?></td>
</tr>
</table>
</div>
</div>
<? $this->inc('elements/footer.php'); ?>
## Instruction:
Fix broken newsflow on launch
## Code After:
<?
defined('C5_EXECUTE') or die("Access Denied.");
$this->inc('elements/header.php', array('enableEditing' => true));
?>
<div class="ccm-ui">
<div class="newsflow" id="newsflow-main">
<? $this->inc('elements/header_newsflow.php'); ?>
<table class="newsflow-layout">
<tr>
<td class="newsflow-em1" style="width: 66%" rowspan="3">
<div id="ccm-dashboard-welcome-back">
<? $a = new Area('Primary'); $a->display($c); ?>
</div>
</td>
<td><? $a = new Area('Secondary 1'); $a->display($c); ?></td>
</tr>
<tr>
<td style="width: 34%"><? $a = new Area('Secondary 2'); $a->display($c); ?></td>
</tr>
<tr>
<td style="width: 34%"><? $a = new Area('Secondary 5'); $a->display($c); ?></td>
</tr>
</table>
</div>
</div>
<? $this->inc('elements/footer.php'); ?> |
32a273f7e7de1864a83c64888d31667ee7fc5861 | app/overrides/retailer_admin_tab.rb | app/overrides/retailer_admin_tab.rb | Deface::Override.new(:virtual_path => "spree/layouts/admin",
:name => "add_retailers_admin_tab",
:insert_bottom => "[data-hook='admin_tabs'], #admin_tabs[data-hook]",
:text => "<%= tab(:retailers, :retailer_types, :local_stockists) %>",
:disabled => false,
:original => '031652cf5a054796022506622082ab6d2693699f') | Deface::Override.new(:virtual_path => "spree/admin/shared/_menu",
:name => "admin_retailers_tab",
:insert_bottom => "[data-hook='admin_tabs']",
:text => "<% if can? :admin, Spree::Admin::RetailersController %>
<%= tab :retailers, :url => spree.admin_retailers_path, :icon => 'globe' -%>
<% end %>",
:disabled => false)
Deface::Override.new(:virtual_path => "spree/layouts/admin",
:name => "add_retailers_admin_tab",
:insert_bottom => "[data-hook='admin_tabs'], #admin_tabs[data-hook]",
:text => "<%= tab(:retailers, :retailer_types, :local_stockists) %>",
:disabled => false,
:original => '031652cf5a054796022506622082ab6d2693699f') | Add retailer link in admin main menu | Add retailer link in admin main menu
| Ruby | bsd-3-clause | bricesanchez/spree_retailers,bricesanchez/spree_retailers | ruby | ## Code Before:
Deface::Override.new(:virtual_path => "spree/layouts/admin",
:name => "add_retailers_admin_tab",
:insert_bottom => "[data-hook='admin_tabs'], #admin_tabs[data-hook]",
:text => "<%= tab(:retailers, :retailer_types, :local_stockists) %>",
:disabled => false,
:original => '031652cf5a054796022506622082ab6d2693699f')
## Instruction:
Add retailer link in admin main menu
## Code After:
Deface::Override.new(:virtual_path => "spree/admin/shared/_menu",
:name => "admin_retailers_tab",
:insert_bottom => "[data-hook='admin_tabs']",
:text => "<% if can? :admin, Spree::Admin::RetailersController %>
<%= tab :retailers, :url => spree.admin_retailers_path, :icon => 'globe' -%>
<% end %>",
:disabled => false)
Deface::Override.new(:virtual_path => "spree/layouts/admin",
:name => "add_retailers_admin_tab",
:insert_bottom => "[data-hook='admin_tabs'], #admin_tabs[data-hook]",
:text => "<%= tab(:retailers, :retailer_types, :local_stockists) %>",
:disabled => false,
:original => '031652cf5a054796022506622082ab6d2693699f') |
6850d0dfdfbd4e4557da81857237d9d597752268 | setup.py | setup.py |
from setuptools import setup
setup(
name='aubreylib',
version='1.2.2',
description='A helper library for the Aubrey access system.',
author='University of North Texas Libraries',
author_email='mark.phillips@unt.edu',
url='https://github.com/unt-libraries/aubreylib',
license='BSD',
packages=['aubreylib'],
install_requires=[
'lxml>=3.3.3',
'pyuntl>=1.0.1',
'pypairtree>=1.0.0',
],
classifiers=[
'Intended Audience :: Developers',
'Natural Language :: English',
'Programming Language :: Python',
'Programming Language :: Python :: 3.7',
]
)
|
from setuptools import setup
setup(
name='aubreylib',
version='1.2.2',
description='A helper library for the Aubrey access system.',
author='University of North Texas Libraries',
author_email='mark.phillips@unt.edu',
url='https://github.com/unt-libraries/aubreylib',
license='BSD',
packages=['aubreylib'],
install_requires=[
'lxml>=3.3.3',
'pypairtree>=1.0.0',
],
classifiers=[
'Intended Audience :: Developers',
'Natural Language :: English',
'Programming Language :: Python',
'Programming Language :: Python :: 3.7',
]
)
| Move pyuntl dependency from here to requirements-test.txt file. | Move pyuntl dependency from here to requirements-test.txt file.
| Python | bsd-3-clause | unt-libraries/aubreylib | python | ## Code Before:
from setuptools import setup
setup(
name='aubreylib',
version='1.2.2',
description='A helper library for the Aubrey access system.',
author='University of North Texas Libraries',
author_email='mark.phillips@unt.edu',
url='https://github.com/unt-libraries/aubreylib',
license='BSD',
packages=['aubreylib'],
install_requires=[
'lxml>=3.3.3',
'pyuntl>=1.0.1',
'pypairtree>=1.0.0',
],
classifiers=[
'Intended Audience :: Developers',
'Natural Language :: English',
'Programming Language :: Python',
'Programming Language :: Python :: 3.7',
]
)
## Instruction:
Move pyuntl dependency from here to requirements-test.txt file.
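One common way to express this split (runtime dependencies in `install_requires`, test-only ones in a separate file) is to surface the test file as an extra. The sketch below is illustrative and not taken from the aubreylib repository; the file names simply mirror the commit message:

```python
# Hypothetical sketch of wiring a separate test requirements file into setup().
from pathlib import Path
from setuptools import setup

def read_requirements(path):
    # one specifier per line; skip blanks and comments
    lines = Path(path).read_text().splitlines()
    return [ln.strip() for ln in lines if ln.strip() and not ln.startswith("#")]

setup(
    name="example-package",  # placeholder
    install_requires=read_requirements("requirements.txt"),
    extras_require={"test": read_requirements("requirements-test.txt")},
)
```

With that layout, `pip install -e .[test]` pulls in the test-only packages such as pyuntl without forcing them on regular installs.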
## Code After:
from setuptools import setup
setup(
name='aubreylib',
version='1.2.2',
description='A helper library for the Aubrey access system.',
author='University of North Texas Libraries',
author_email='mark.phillips@unt.edu',
url='https://github.com/unt-libraries/aubreylib',
license='BSD',
packages=['aubreylib'],
install_requires=[
'lxml>=3.3.3',
'pypairtree>=1.0.0',
],
classifiers=[
'Intended Audience :: Developers',
'Natural Language :: English',
'Programming Language :: Python',
'Programming Language :: Python :: 3.7',
]
)
|
487d3f2853918937e26312be4e25a2fc6118e41b | src/devices/incoming_bot_notification.ts | src/devices/incoming_bot_notification.ts | import { beep } from "../util";
import { Notification } from "farmbot/dist/jsonrpc";
import {
HardwareState,
RpcBotLog
} from "../devices/interfaces";
import { error, success, warning } from "../ui";
import { t } from "i18next";
export function handleIncomingBotNotification(msg: Notification<any>,
dispatch: Function) {
switch (msg.method) {
case "status_update":
dispatch(statusUpdate((msg as Notification<[HardwareState]>)));
beep();
break;
case "log_message":
handleLogMessage(dispatch, (msg as Notification<[RpcBotLog]>));
break;
case "log_dump":
dispatch(logDump(msg as Notification<RpcBotLog[]>));
break;
};
}
function handleLogMessage(dispatch: Function,
message: Notification<[RpcBotLog]>) {
dispatch(logNotification(message));
}
function statusUpdate(statusMessage: Notification<[HardwareState]>) {
return {
type: "BOT_CHANGE",
payload: statusMessage.params[0]
};
}
function logNotification(botLog:
Notification<[RpcBotLog]>) {
return {
type: "BOT_LOG",
payload: botLog.params[0]
};
};
function logDump(msgs: Notification<RpcBotLog[]>) {
return {
type: "BOT_LOG_DUMP",
payload: msgs
};
}
| import { beep } from "../util";
import { Notification } from "farmbot/dist/jsonrpc";
import {
HardwareState,
RpcBotLog
} from "../devices/interfaces";
import { error, success, warning } from "../ui";
import { t } from "i18next";
export function handleIncomingBotNotification(msg: Notification<any>,
dispatch: Function) {
switch (msg.method) {
case "status_update":
dispatch(statusUpdate((msg as Notification<[HardwareState]>)));
beep();
break;
case "log_message":
dispatch(logNotification(msg));
break;
};
}
function statusUpdate(statusMessage: Notification<[HardwareState]>) {
return {
type: "BOT_CHANGE",
payload: statusMessage.params[0]
};
}
function logNotification(botLog:
Notification<[RpcBotLog]>) {
return {
type: "BOT_LOG",
payload: botLog.params[0]
};
};
function logDump(msgs: Notification<RpcBotLog[]>) {
return {
type: "BOT_LOG_DUMP",
payload: msgs
};
}
| Remove unneeded LOG_DUMP switch branch | Remove unneeded LOG_DUMP switch branch
| TypeScript | mit | MrChristofferson/farmbot-web-frontend,MrChristofferson/farmbot-web-frontend,FarmBot/farmbot-web-frontend,RickCarlino/farmbot-web-frontend,RickCarlino/farmbot-web-frontend,FarmBot/farmbot-web-frontend | typescript | ## Code Before:
import { beep } from "../util";
import { Notification } from "farmbot/dist/jsonrpc";
import {
HardwareState,
RpcBotLog
} from "../devices/interfaces";
import { error, success, warning } from "../ui";
import { t } from "i18next";
export function handleIncomingBotNotification(msg: Notification<any>,
dispatch: Function) {
switch (msg.method) {
case "status_update":
dispatch(statusUpdate((msg as Notification<[HardwareState]>)));
beep();
break;
case "log_message":
handleLogMessage(dispatch, (msg as Notification<[RpcBotLog]>));
break;
case "log_dump":
dispatch(logDump(msg as Notification<RpcBotLog[]>));
break;
};
}
function handleLogMessage(dispatch: Function,
message: Notification<[RpcBotLog]>) {
dispatch(logNotification(message));
}
function statusUpdate(statusMessage: Notification<[HardwareState]>) {
return {
type: "BOT_CHANGE",
payload: statusMessage.params[0]
};
}
function logNotification(botLog:
Notification<[RpcBotLog]>) {
return {
type: "BOT_LOG",
payload: botLog.params[0]
};
};
function logDump(msgs: Notification<RpcBotLog[]>) {
return {
type: "BOT_LOG_DUMP",
payload: msgs
};
}
## Instruction:
Remove unneeded LOG_DUMP switch branch
## Code After:
import { beep } from "../util";
import { Notification } from "farmbot/dist/jsonrpc";
import {
HardwareState,
RpcBotLog
} from "../devices/interfaces";
import { error, success, warning } from "../ui";
import { t } from "i18next";
export function handleIncomingBotNotification(msg: Notification<any>,
dispatch: Function) {
switch (msg.method) {
case "status_update":
dispatch(statusUpdate((msg as Notification<[HardwareState]>)));
beep();
break;
case "log_message":
dispatch(logNotification(msg));
break;
};
}
function statusUpdate(statusMessage: Notification<[HardwareState]>) {
return {
type: "BOT_CHANGE",
payload: statusMessage.params[0]
};
}
function logNotification(botLog:
Notification<[RpcBotLog]>) {
return {
type: "BOT_LOG",
payload: botLog.params[0]
};
};
function logDump(msgs: Notification<RpcBotLog[]>) {
return {
type: "BOT_LOG_DUMP",
payload: msgs
};
}
|
f2497db0b251fab94455f1bd4377e459e93583ad | lib/honey_format/header.rb | lib/honey_format/header.rb | require 'honey_format/columns'
module HoneyFormat
# Represents a header
class Header
attr_reader :column_names
# @return [Header] a new instance of Header.
# @param [Array] header array of strings.
# @param [Array] valid array of symbols representing valid columns.
# @raise [MissingCSVHeaderError] raised when header is missing (empty or nil).
def initialize(header, valid: :all, converter: ConvertHeaderValue)
@column_names = build_header(header)
@columns = Columns.new(@column_names, valid: valid, converter: converter)
end
# Returns columns as array.
# @return [Array] of columns.
def columns
@columns.to_a
end
private
def build_header(header)
if header.nil? || header.empty?
fail(MissingCSVHeaderError, "CSV header can't be empty.")
end
Sanitize.array!(header)
end
end
end
| require 'honey_format/columns'
module HoneyFormat
# Represents a header
class Header
attr_reader :column_names
# @return [Header] a new instance of Header.
# @param [Array] header array of strings.
# @param [Array] valid array of symbols representing valid columns.
# @raise [MissingCSVHeaderError] raised when header is missing (empty or nil).
def initialize(header, valid: :all, converter: ConvertHeaderValue)
if header.nil? || header.empty?
fail(MissingCSVHeaderError, "CSV header can't be empty.")
end
@column_names = header
@columns = Columns.new(header, valid: valid, converter: converter)
end
# Returns columns as array.
# @return [Array] of columns.
def columns
@columns.to_a
end
end
end
| Remove unneeded method and call to Sanitize::array | Remove unneeded method and call to Sanitize::array
| Ruby | mit | buren/honey_format,buren/honey_format | ruby | ## Code Before:
require 'honey_format/columns'
module HoneyFormat
# Represents a header
class Header
attr_reader :column_names
# @return [Header] a new instance of Header.
# @param [Array] header array of strings.
# @param [Array] valid array of symbols representing valid columns.
# @raise [MissingCSVHeaderError] raised when header is missing (empty or nil).
def initialize(header, valid: :all, converter: ConvertHeaderValue)
@column_names = build_header(header)
@columns = Columns.new(@column_names, valid: valid, converter: converter)
end
# Returns columns as array.
# @return [Array] of columns.
def columns
@columns.to_a
end
private
def build_header(header)
if header.nil? || header.empty?
fail(MissingCSVHeaderError, "CSV header can't be empty.")
end
Sanitize.array!(header)
end
end
end
## Instruction:
Remove unneeded method and call to Sanitize::array
## Code After:
require 'honey_format/columns'
module HoneyFormat
# Represents a header
class Header
attr_reader :column_names
# @return [Header] a new instance of Header.
# @param [Array] header array of strings.
# @param [Array] valid array of symbols representing valid columns.
# @raise [MissingCSVHeaderError] raised when header is missing (empty or nil).
def initialize(header, valid: :all, converter: ConvertHeaderValue)
if header.nil? || header.empty?
fail(MissingCSVHeaderError, "CSV header can't be empty.")
end
@column_names = header
@columns = Columns.new(header, valid: valid, converter: converter)
end
# Returns columns as array.
# @return [Array] of columns.
def columns
@columns.to_a
end
end
end
|
a6cb8042cd7f9bbdfc42c84ef48269eaf30e0a7a | package.json | package.json | {
"name": "jenkins-metrics",
"version": "0.0.1",
"description": "Grabs data from Jenkins",
"main": "lib/app.js",
"directories": {
"lib": "lib"
},
"keywords": [
"jenkins",
"metrics"
],
"license": "MIT",
"dependencies": {
"bluebird": "^3.3.5",
"config": "^1.20.1",
"request-promise": "^3.0.0"
},
"devDependencies": {
"babel-cli": "^6.8.0",
"babel-preset-es2015": "^6.6.0",
"gulp-babel": "^6.1.2"
}
}
| {
"name": "jenkins-metrics",
"version": "0.0.1",
"description": "Grabs data from Jenkins",
"main": "lib/app.js",
"directories": {
"lib": "lib"
},
"keywords": [
"jenkins",
"metrics"
],
"license": "MIT",
"dependencies": {
"bluebird": "^3.3.5",
"config": "^1.20.1",
"request-promise": "^3.0.0",
"rethinkdb": "^2.3.1"
}
}
| Remove unneeded dev deps (install them as globals instead, based on used ide) | Remove unneeded dev deps (install them as globals instead, based on used ide)
| JSON | mit | TOPdesk/jenkins-metrics-collector | json | ## Code Before:
{
"name": "jenkins-metrics",
"version": "0.0.1",
"description": "Grabs data from Jenkins",
"main": "lib/app.js",
"directories": {
"lib": "lib"
},
"keywords": [
"jenkins",
"metrics"
],
"license": "MIT",
"dependencies": {
"bluebird": "^3.3.5",
"config": "^1.20.1",
"request-promise": "^3.0.0"
},
"devDependencies": {
"babel-cli": "^6.8.0",
"babel-preset-es2015": "^6.6.0",
"gulp-babel": "^6.1.2"
}
}
## Instruction:
Remove unneeded dev deps (install them as globals instead, based on used ide)
## Code After:
{
"name": "jenkins-metrics",
"version": "0.0.1",
"description": "Grabs data from Jenkins",
"main": "lib/app.js",
"directories": {
"lib": "lib"
},
"keywords": [
"jenkins",
"metrics"
],
"license": "MIT",
"dependencies": {
"bluebird": "^3.3.5",
"config": "^1.20.1",
"request-promise": "^3.0.0",
"rethinkdb": "^2.3.1"
}
}
|
2f3868cd68aa27b48d3d5a2fecdf24336915b7bb | composer.json | composer.json | {
"name": "react/filesystem",
"description": "Asynchronous filesystem abstraction.",
"keywords": ["filesystem", "asynchronous", "eio"],
"license": "MIT",
"authors": [
{
"name": "Cees-Jan Kiewiet",
"email": "ceesjank@gmail.com"
}
],
"require": {
"php": ">=5.4.0",
"react/event-loop": "^0.4",
"react/promise": "~2.2",
"react/stream": "^0.4",
"evenement/evenement": "~2.0",
"wyrihaximus/react-child-process-pool": "^1.3"
},
"require-dev": {
"clue/block-react": "^1.1",
"phpunit/phpunit": "^6.0 || ^5.0 || ^4.8"
},
"suggest": {
"ext-eio": "^1.2"
},
"autoload": {
"psr-4": {
"React\\Filesystem\\": "src/"
},
"files": ["src/functions_include.php"]
},
"autoload-dev": {
"psr-4": {
"React\\Tests\\Filesystem\\": "tests/"
}
},
"config": {
"sort-packages": true,
"platform": {
"php": "5.4"
}
}
}
| {
"name": "react/filesystem",
"description": "Asynchronous filesystem abstraction.",
"keywords": ["filesystem", "asynchronous", "eio"],
"license": "MIT",
"authors": [
{
"name": "Cees-Jan Kiewiet",
"email": "ceesjank@gmail.com"
}
],
"require": {
"php": ">=5.4.0",
"evenement/evenement": "^3.0 || ^2.0",
"react/event-loop": "^0.4",
"react/promise": "~2.2",
"react/stream": "^0.4",
"wyrihaximus/react-child-process-pool": "^1.3"
},
"require-dev": {
"clue/block-react": "^1.1",
"phpunit/phpunit": "^6.0 || ^5.0 || ^4.8"
},
"suggest": {
"ext-eio": "^1.2"
},
"autoload": {
"psr-4": {
"React\\Filesystem\\": "src/"
},
"files": ["src/functions_include.php"]
},
"autoload-dev": {
"psr-4": {
"React\\Tests\\Filesystem\\": "tests/"
}
},
"config": {
"sort-packages": true,
"platform": {
"php": "5.4"
}
}
}
| Support both evenement 3.0 and 2.0 | Support both evenement 3.0 and 2.0
| JSON | mit | reactphp/filesystem,reactphp/filesystem | json | ## Code Before:
{
"name": "react/filesystem",
"description": "Asynchronous filesystem abstraction.",
"keywords": ["filesystem", "asynchronous", "eio"],
"license": "MIT",
"authors": [
{
"name": "Cees-Jan Kiewiet",
"email": "ceesjank@gmail.com"
}
],
"require": {
"php": ">=5.4.0",
"react/event-loop": "^0.4",
"react/promise": "~2.2",
"react/stream": "^0.4",
"evenement/evenement": "~2.0",
"wyrihaximus/react-child-process-pool": "^1.3"
},
"require-dev": {
"clue/block-react": "^1.1",
"phpunit/phpunit": "^6.0 || ^5.0 || ^4.8"
},
"suggest": {
"ext-eio": "^1.2"
},
"autoload": {
"psr-4": {
"React\\Filesystem\\": "src/"
},
"files": ["src/functions_include.php"]
},
"autoload-dev": {
"psr-4": {
"React\\Tests\\Filesystem\\": "tests/"
}
},
"config": {
"sort-packages": true,
"platform": {
"php": "5.4"
}
}
}
## Instruction:
Support both evenement 3.0 and 2.0
## Code After:
{
"name": "react/filesystem",
"description": "Asynchronous filesystem abstraction.",
"keywords": ["filesystem", "asynchronous", "eio"],
"license": "MIT",
"authors": [
{
"name": "Cees-Jan Kiewiet",
"email": "ceesjank@gmail.com"
}
],
"require": {
"php": ">=5.4.0",
"evenement/evenement": "^3.0 || ^2.0",
"react/event-loop": "^0.4",
"react/promise": "~2.2",
"react/stream": "^0.4",
"wyrihaximus/react-child-process-pool": "^1.3"
},
"require-dev": {
"clue/block-react": "^1.1",
"phpunit/phpunit": "^6.0 || ^5.0 || ^4.8"
},
"suggest": {
"ext-eio": "^1.2"
},
"autoload": {
"psr-4": {
"React\\Filesystem\\": "src/"
},
"files": ["src/functions_include.php"]
},
"autoload-dev": {
"psr-4": {
"React\\Tests\\Filesystem\\": "tests/"
}
},
"config": {
"sort-packages": true,
"platform": {
"php": "5.4"
}
}
}
|
d04a66bd09271208baafe30733645c30e9f09675 | source/documentacao/index.html.erb | source/documentacao/index.html.erb | ---
title: Documentação
layout: mobile
description: Use direto dos nossos servidores ou baixe uma versão estável para incorporar em seu projeto ou usar offline.
---
<script>
window.location.href = '/documentacao/introducao/';
</script>
| ---
title: Documentação
layout: mobile
description: Use direto dos nossos servidores ou baixe uma versão estável para incorporar em seu projeto ou usar offline.
---
<%= partial 'documentacao/shared/passo-a-passo/passo' %>
| Revert "Change /documentacao/ redict to /documentacao/introducao/" | Revert "Change /documentacao/ redict to /documentacao/introducao/"
This reverts commit 72cb04753d4c11bf47a1f9f757dcdc40ae8877ae.
| HTML+ERB | mit | deividmarques/locawebstyle,locaweb/locawebstyle,locaweb/locawebstyle,deividmarques/locawebstyle,deividmarques/locawebstyle,locaweb/locawebstyle | html+erb | ## Code Before:
---
title: Documentação
layout: mobile
description: Use direto dos nossos servidores ou baixe uma versão estável para incorporar em seu projeto ou usar offline.
---
<script>
window.location.href = '/documentacao/introducao/';
</script>
## Instruction:
Revert "Change /documentacao/ redict to /documentacao/introducao/"
This reverts commit 72cb04753d4c11bf47a1f9f757dcdc40ae8877ae.
## Code After:
---
title: Documentação
layout: mobile
description: Use direto dos nossos servidores ou baixe uma versão estável para incorporar em seu projeto ou usar offline.
---
<%= partial 'documentacao/shared/passo-a-passo/passo' %>
|
08720bd1e029fc6f5c35ebe049c8cb3573182277 | CHANGELOG-4.x.md | CHANGELOG-4.x.md | 4.0.0 (unreleased)
------------------
* Drop support for PHP < 8.0
* Add support for Symfony 6
* Drop support for Symfony 4
* `Mogrify` annotation is now a PHP 8 attribute
| 4.0.0
-----
* Drop support for PHP < 8.0
* Add support for Symfony 6
* Drop support for Symfony 4 & Symfony < 5.4
* `Mogrify` annotation is now a PHP 8 attribute
| Update changelog for 4.0.0 release | Update changelog for 4.0.0 release
| Markdown | mit | leapt/im-bundle | markdown | ## Code Before:
4.0.0 (unreleased)
------------------
* Drop support for PHP < 8.0
* Add support for Symfony 6
* Drop support for Symfony 4
* `Mogrify` annotation is now a PHP 8 attribute
## Instruction:
Update changelog for 4.0.0 release
## Code After:
4.0.0
-----
* Drop support for PHP < 8.0
* Add support for Symfony 6
* Drop support for Symfony 4 & Symfony < 5.4
* `Mogrify` annotation is now a PHP 8 attribute
|
4761c8f016e9cbbe655c802875a5efe5fc8c11a0 | README.md | README.md | Yandex translate module inside a map-stream.
## Usage
``` js
var translate = require('yandex-translate-stream')(<your yandex api key here>)
var data = {
translate: {
data: 'Hola',
options: {
from: 'es',
to: 'fr'
}
}
}
// Will emit the same object with a translated property that equals '[bonjour]'
translate.write(data)
```
If the incomming data is anything but an object or the object doesn't have a 'translate' property, it behaves as a pass throwgh emmiting data unmodified without making any api calls.
| Yandex translate module inside a map-stream.
[![js-standard-style](https://cdn.rawgit.com/feross/standard/master/badge.svg)](https://github.com/feross/standard)
## Usage
``` js
var translate = require('yandex-translate-stream')(<your yandex api key here>)
var data = {
translate: {
data: 'Hola',
options: {
from: 'es',
to: 'fr'
}
}
}
// Will emit the same object with a translated property that equals '[bonjour]'
translate.write(data)
```
If the incomming data is anything but an object or the object doesn't have a 'translate' property, it behaves as a pass throwgh emmiting data unmodified without making any api calls.
| Fix code style and add standard badge. | Fix code style and add standard badge.
| Markdown | mit | santiagogil/yandex-translate-stream | markdown | ## Code Before:
Yandex translate module inside a map-stream.
## Usage
``` js
var translate = require('yandex-translate-stream')(<your yandex api key here>)
var data = {
translate: {
data: 'Hola',
options: {
from: 'es',
to: 'fr'
}
}
}
// Will emit the same object with a translated property that equals '[bonjour]'
translate.write(data)
```
If the incomming data is anything but an object or the object doesn't have a 'translate' property, it behaves as a pass throwgh emmiting data unmodified without making any api calls.
## Instruction:
Fix code style and add standard badge.
## Code After:
Yandex translate module inside a map-stream.
[![js-standard-style](https://cdn.rawgit.com/feross/standard/master/badge.svg)](https://github.com/feross/standard)
## Usage
``` js
var translate = require('yandex-translate-stream')(<your yandex api key here>)
var data = {
translate: {
data: 'Hola',
options: {
from: 'es',
to: 'fr'
}
}
}
// Will emit the same object with a translated property that equals '[bonjour]'
translate.write(data)
```
If the incomming data is anything but an object or the object doesn't have a 'translate' property, it behaves as a pass throwgh emmiting data unmodified without making any api calls.
|
de0933b3ca5edc55cc4c9bd78c6c98896d584984 | .travis.yml | .travis.yml | language: python
env:
- DJANGO="Django>=1.8,<1.9"
- DJANGO="Django>=1.9,<1.10"
- DJANGO="Django>=1.10,<1.11"
- DJANGO="Django>=1.11,<1.12"
python:
- "2.7"
- "3.4"
- "3.5"
- "3.6"
install:
- pip install --upgrade pip setuptools
- pip install $DJANGO
- pip install .
- pip install coveralls
- pip install coverage
script:
- coverage run --source mezzanine setup.py test
notifications:
irc: "irc.freenode.org#mezzanine"
on_success: change
on_failure: change
after_success: coveralls
| language: python
env:
- DJANGO="Django>=1.8,<1.9"
- DJANGO="Django>=1.9,<1.10"
- DJANGO="Django>=1.10,<1.11"
- DJANGO="Django>=1.11,<1.12"
python:
- "2.7"
- "3.4"
- "3.5"
- "3.6"
install:
- pip install --upgrade pip setuptools
- pip install $DJANGO
- pip install .
notifications:
irc: "irc.freenode.org#mezzanine"
on_success: change
on_failure: change
| Remove thing that doesn't understand dependencies. | Remove thing that doesn't understand dependencies.
| YAML | bsd-2-clause | frankier/mezzanine,dsanders11/mezzanine,stephenmcd/mezzanine,frankier/mezzanine,molokov/mezzanine,stephenmcd/mezzanine,jerivas/mezzanine,dsanders11/mezzanine,molokov/mezzanine,readevalprint/mezzanine,readevalprint/mezzanine,christianwgd/mezzanine,dsanders11/mezzanine,molokov/mezzanine,readevalprint/mezzanine,stephenmcd/mezzanine,jerivas/mezzanine,christianwgd/mezzanine,jerivas/mezzanine,frankier/mezzanine,christianwgd/mezzanine | yaml | ## Code Before:
language: python
env:
- DJANGO="Django>=1.8,<1.9"
- DJANGO="Django>=1.9,<1.10"
- DJANGO="Django>=1.10,<1.11"
- DJANGO="Django>=1.11,<1.12"
python:
- "2.7"
- "3.4"
- "3.5"
- "3.6"
install:
- pip install --upgrade pip setuptools
- pip install $DJANGO
- pip install .
- pip install coveralls
- pip install coverage
script:
- coverage run --source mezzanine setup.py test
notifications:
irc: "irc.freenode.org#mezzanine"
on_success: change
on_failure: change
after_success: coveralls
## Instruction:
Remove thing that doesn't understand dependencies.
## Code After:
language: python
env:
- DJANGO="Django>=1.8,<1.9"
- DJANGO="Django>=1.9,<1.10"
- DJANGO="Django>=1.10,<1.11"
- DJANGO="Django>=1.11,<1.12"
python:
- "2.7"
- "3.4"
- "3.5"
- "3.6"
install:
- pip install --upgrade pip setuptools
- pip install $DJANGO
- pip install .
notifications:
irc: "irc.freenode.org#mezzanine"
on_success: change
on_failure: change
|
005bc4fb9fb8baa3939d86f115fc1c930f7aa75d | models/payfasttoken_model.js | models/payfasttoken_model.js | var mongoose = require('mongoose');
var Schema = mongoose.Schema;
var ObjectId = mongoose.Schema.Types.ObjectId;
var User = require("./user_model");
var Organisation = require("./organisation_model");
var PayfasttokenSchema = new Schema({
token: { type: String, index: true, unique: true },
user_id: { type: ObjectId, ref: 'User', index: true },
organisation_id: { type: ObjectId, ref: 'Organisation', index: true },
card_number: String,
expiration_date: { type: Date, index: true },
date_created: { type: Date, default: Date.now },
_owner_id: ObjectId,
_deleted: { type: Boolean, default: false, index: true },
});
PayfasttokenSchema.set("_perms", {
super_user: "crud",
admin: "r",
owner: "r",
});
module.exports = mongoose.model('Payfasttoken', PayfasttokenSchema); | var mongoose = require('mongoose');
var Schema = mongoose.Schema;
var ObjectId = mongoose.Schema.Types.ObjectId;
var Mixed = mongoose.Schema.Types.Mixed;
var User = require("./user_model");
var Organisation = require("./organisation_model");
var PayfasttokenSchema = new Schema({
token: { type: String, index: true, unique: true },
user_id: { type: ObjectId, ref: 'User', index: true },
organisation_id: { type: ObjectId, ref: 'Organisation', index: true },
card_number: String,
expiration_date: { type: Date, index: true },
name_first: String,
name_last: String,
email: String,
payfast_response: Mixed,
date_created: { type: Date, default: Date.now },
_owner_id: ObjectId,
_deleted: { type: Boolean, default: false, index: true },
});
PayfasttokenSchema.set("_perms", {
admin: "crud",
owner: "rd",
});
module.exports = mongoose.model('Payfasttoken', PayfasttokenSchema); | Change some permissions and store some more details from Payfast | Change some permissions and store some more details from Payfast
| JavaScript | mit | 10layer/jexpress,10layer/jexpress | javascript | ## Code Before:
var mongoose = require('mongoose');
var Schema = mongoose.Schema;
var ObjectId = mongoose.Schema.Types.ObjectId;
var User = require("./user_model");
var Organisation = require("./organisation_model");
var PayfasttokenSchema = new Schema({
token: { type: String, index: true, unique: true },
user_id: { type: ObjectId, ref: 'User', index: true },
organisation_id: { type: ObjectId, ref: 'Organisation', index: true },
card_number: String,
expiration_date: { type: Date, index: true },
date_created: { type: Date, default: Date.now },
_owner_id: ObjectId,
_deleted: { type: Boolean, default: false, index: true },
});
PayfasttokenSchema.set("_perms", {
super_user: "crud",
admin: "r",
owner: "r",
});
module.exports = mongoose.model('Payfasttoken', PayfasttokenSchema);
## Instruction:
Change some permissions and store some more details from Payfast
## Code After:
var mongoose = require('mongoose');
var Schema = mongoose.Schema;
var ObjectId = mongoose.Schema.Types.ObjectId;
var Mixed = mongoose.Schema.Types.Mixed;
var User = require("./user_model");
var Organisation = require("./organisation_model");
var PayfasttokenSchema = new Schema({
token: { type: String, index: true, unique: true },
user_id: { type: ObjectId, ref: 'User', index: true },
organisation_id: { type: ObjectId, ref: 'Organisation', index: true },
card_number: String,
expiration_date: { type: Date, index: true },
name_first: String,
name_last: String,
email: String,
payfast_response: Mixed,
date_created: { type: Date, default: Date.now },
_owner_id: ObjectId,
_deleted: { type: Boolean, default: false, index: true },
});
PayfasttokenSchema.set("_perms", {
admin: "crud",
owner: "rd",
});
module.exports = mongoose.model('Payfasttoken', PayfasttokenSchema); |
5308a8aa38973009530b9073e13c1f9eca2ba449 | R/prepare_data.r | R/prepare_data.r | prepare_paco_data <- function(H, P, HP)
{
if(NROW(H) != NCOL(H))
stop("H should be a square matrix")
if(NROW(P) != NCOL(P))
stop("P should be a square matrix")
if(NROW(H) != NROW(HP)){
warning("The HP matrix should have hosts in rows. It has been translated.")
HP <- t(HP)
}
H <- H[rownames(HP),rownames(HP)]
P <- P[colnames(HP),colnames(HP)]
HP[HP>0] <- 1
return(list(H=H, P=P, HP=HP))
}
| prepare_paco_data <- function(H, P, HP)
{
if(NROW(H) != NCOL(H))
stop("H should be a square matrix")
if(NROW(P) != NCOL(P))
stop("P should be a square matrix")
if(NROW(H) != NROW(HP)){
warning("The HP matrix should have hosts in rows. It has been translated.")
HP <- t(HP)
}
if (!(NROW (H) %in% dim(HP)))
stop ("The number of species in H and HP don't match")
if (!(NROW (P) %in% dim(HP)))
stop ("The number of species in P and HP don't match")
H <- H[rownames(HP),rownames(HP)]
P <- P[colnames(HP),colnames(HP)]
HP[HP>0] <- 1
return(list(H=H, P=P, HP=HP))
}
| Add informative error when matrix size is diff | Add informative error when matrix size is diff
When the number of species in the Host or Parasite phylogeny is
different to the number of species on the interaction matrix it used to
spit the “subscript out of bounds” error. Now it tells the actual
problem
| R | mpl-2.0 | efcaguab/paco | r | ## Code Before:
prepare_paco_data <- function(H, P, HP)
{
if(NROW(H) != NCOL(H))
stop("H should be a square matrix")
if(NROW(P) != NCOL(P))
stop("P should be a square matrix")
if(NROW(H) != NROW(HP)){
warning("The HP matrix should have hosts in rows. It has been translated.")
HP <- t(HP)
}
H <- H[rownames(HP),rownames(HP)]
P <- P[colnames(HP),colnames(HP)]
HP[HP>0] <- 1
return(list(H=H, P=P, HP=HP))
}
## Instruction:
Add informative error when matrix size is diff
When the number of species in the Host or Parasite phylogeny is
different to the number of species on the interaction matrix it used to
spit the “subscript out of bounds” error. Now it tells the actual
problem
## Code After:
prepare_paco_data <- function(H, P, HP)
{
if(NROW(H) != NCOL(H))
stop("H should be a square matrix")
if(NROW(P) != NCOL(P))
stop("P should be a square matrix")
if(NROW(H) != NROW(HP)){
warning("The HP matrix should have hosts in rows. It has been translated.")
HP <- t(HP)
}
if (!(NROW (H) %in% dim(HP)))
stop ("The number of species in H and HP don't match")
if (!(NROW (P) %in% dim(HP)))
stop ("The number of species in P and HP don't match")
H <- H[rownames(HP),rownames(HP)]
P <- P[colnames(HP),colnames(HP)]
HP[HP>0] <- 1
return(list(H=H, P=P, HP=HP))
}
|
a9bd2b9ce9f4e969c9877ab452e8ae1167858828 | .travis.yml | .travis.yml | sudo: false
language: python
cache: pip
python:
- 2.7
- 3.4
- 3.5
- 3.6
addons:
postgresql: "9.4"
env:
- BLINKER=1 DATABASE_URI=postgresql://localhost/travis_ci_test
- BLINKER=1 DATABASE_URI=sqlite://
- BLINKER=0 DATABASE_URI=sqlite://
branches:
only:
- master
before_script:
- if [[ $DATABASE_URI == postgresql* ]]; then psql -c 'create database travis_ci_test;' -U postgres; fi
install:
- travis_retry pip install -r requirements.txt
- travis_retry pip install -r dev-requirements.txt
- if [[ $BLINKER == 1 ]]; then travis_retry pip install blinker; fi
- if [[ $DATABASE_URI == postgresql* ]]; then travis_retry pip install psycopg2; fi
- travis_retry pip install codecov
- pip install -e .
script:
- py.test --cov=flask_dance
after_success:
- codecov
| sudo: false
language: python
cache: pip
python:
- 2.7
- 3.4
- 3.5
- 3.6
addons:
postgresql: "9.4"
env:
- BLINKER=1 DATABASE_URI=postgresql://localhost/travis_ci_test
- BLINKER=1 DATABASE_URI=sqlite://
- BLINKER=0 DATABASE_URI=sqlite://
branches:
only:
- master
before_script:
- if [[ $DATABASE_URI == postgresql* ]]; then psql -c 'create database travis_ci_test;' -U postgres; fi
install:
- travis_retry pip install -U pytest
- travis_retry pip install -r requirements.txt
- travis_retry pip install -r dev-requirements.txt
- if [[ $BLINKER == 1 ]]; then travis_retry pip install blinker; fi
- if [[ $DATABASE_URI == postgresql* ]]; then travis_retry pip install psycopg2; fi
- travis_retry pip install codecov
- pip install -e .
script:
- py.test --cov=flask_dance
after_success:
- codecov
| Upgrade pytest on Travis CI | Upgrade pytest on Travis CI
| YAML | mit | singingwolfboy/flask-dance | yaml | ## Code Before:
sudo: false
language: python
cache: pip
python:
- 2.7
- 3.4
- 3.5
- 3.6
addons:
postgresql: "9.4"
env:
- BLINKER=1 DATABASE_URI=postgresql://localhost/travis_ci_test
- BLINKER=1 DATABASE_URI=sqlite://
- BLINKER=0 DATABASE_URI=sqlite://
branches:
only:
- master
before_script:
- if [[ $DATABASE_URI == postgresql* ]]; then psql -c 'create database travis_ci_test;' -U postgres; fi
install:
- travis_retry pip install -r requirements.txt
- travis_retry pip install -r dev-requirements.txt
- if [[ $BLINKER == 1 ]]; then travis_retry pip install blinker; fi
- if [[ $DATABASE_URI == postgresql* ]]; then travis_retry pip install psycopg2; fi
- travis_retry pip install codecov
- pip install -e .
script:
- py.test --cov=flask_dance
after_success:
- codecov
## Instruction:
Upgrade pytest on Travis CI
## Code After:
sudo: false
language: python
cache: pip
python:
- 2.7
- 3.4
- 3.5
- 3.6
addons:
postgresql: "9.4"
env:
- BLINKER=1 DATABASE_URI=postgresql://localhost/travis_ci_test
- BLINKER=1 DATABASE_URI=sqlite://
- BLINKER=0 DATABASE_URI=sqlite://
branches:
only:
- master
before_script:
- if [[ $DATABASE_URI == postgresql* ]]; then psql -c 'create database travis_ci_test;' -U postgres; fi
install:
- travis_retry pip install -U pytest
- travis_retry pip install -r requirements.txt
- travis_retry pip install -r dev-requirements.txt
- if [[ $BLINKER == 1 ]]; then travis_retry pip install blinker; fi
- if [[ $DATABASE_URI == postgresql* ]]; then travis_retry pip install psycopg2; fi
- travis_retry pip install codecov
- pip install -e .
script:
- py.test --cov=flask_dance
after_success:
- codecov
|
c290f2670341153d052c2a47cfd34d34a14bc90e | sli/acceptance-tests/test/features/cross_app_tests/rc_sandbox_delete_tenant.feature | sli/acceptance-tests/test/features/cross_app_tests/rc_sandbox_delete_tenant.feature | @RALLY_US4835
@rc
@sandbox
Feature: Delete tenant and drop tenant database
Background: Make a connection to Mongo
Given I have a connection to Mongo
And I am running in Sandbox mode
Scenario: Delete tenant from sli.tenant and drop tenant database
When I get the database name
Then I will drop the whole database
And I will drop the tenant document from the collection
And I will delete the applications "Schlemiel,NotTheAppYoureLookingFor" from the collection
| @RALLY_US4835
@rc
@sandbox
Feature: Delete tenant and drop tenant database
Background: Make a connection to Mongo
Given I have a connection to Mongo
And I am running in Sandbox mode
Scenario: Delete tenant from sli.tenant and drop tenant database
When I get the database name
And I clean my tenant's landing zone
Then I will drop the whole database
And I will drop the tenant document from the collection
And I will delete the applications "Schlemiel,NotTheAppYoureLookingFor" from the collection | Revert "clean up step not needed" | Revert "clean up step not needed"
This reverts commit 3cbea160f6dfb415b0a8364b1e53e16a643a72e2.
| Cucumber | apache-2.0 | inbloom/secure-data-service,inbloom/secure-data-service,inbloom/secure-data-service,inbloom/secure-data-service,inbloom/secure-data-service | cucumber | ## Code Before:
@RALLY_US4835
@rc
@sandbox
Feature: Delete tenant and drop tenant database
Background: Make a connection to Mongo
Given I have a connection to Mongo
And I am running in Sandbox mode
Scenario: Delete tenant from sli.tenant and drop tenant database
When I get the database name
Then I will drop the whole database
And I will drop the tenant document from the collection
And I will delete the applications "Schlemiel,NotTheAppYoureLookingFor" from the collection
## Instruction:
Revert "clean up step not needed"
This reverts commit 3cbea160f6dfb415b0a8364b1e53e16a643a72e2.
## Code After:
@RALLY_US4835
@rc
@sandbox
Feature: Delete tenant and drop tenant database
Background: Make a connection to Mongo
Given I have a connection to Mongo
And I am running in Sandbox mode
Scenario: Delete tenant from sli.tenant and drop tenant database
When I get the database name
And I clean my tenant's landing zone
Then I will drop the whole database
And I will drop the tenant document from the collection
And I will delete the applications "Schlemiel,NotTheAppYoureLookingFor" from the collection |
89132dce0f4341c0dd2c6bdbdd02a89681162acd | src/com/redhat/ceylon/common/tool/ToolMessages.java | src/com/redhat/ceylon/common/tool/ToolMessages.java | /*
* Copyright Red Hat Inc. and/or its affiliates and other contributors
* as indicated by the authors tag. All rights reserved.
*
* This copyrighted material is made available to anyone wishing to use,
* modify, copy, or redistribute it subject to the terms and conditions
* of the GNU General Public License version 2.
*
* This particular file is subject to the "Classpath" exception as provided in the
* LICENSE file that accompanied this code.
*
* This program is distributed in the hope that it will be useful, but WITHOUT A
* WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS FOR A
* PARTICULAR PURPOSE. See the GNU General Public License for more details.
* You should have received a copy of the GNU General Public License,
* along with this distribution; if not, write to the Free Software
* Foundation, Inc., 51 Franklin Street, Fifth Floor, Boston,
* MA 02110-1301, USA.
*/
package com.redhat.ceylon.common.tool;
import java.text.MessageFormat;
import java.util.ResourceBundle;
class ToolMessages {
private static final ResourceBundle RESOURCE_BUNDLE = ResourceBundle.getBundle("com.redhat.ceylon.common.tool.tools");
public static String msg(String msgKey, Object... msgArgs) {
String msg = RESOURCE_BUNDLE.getString(msgKey);
if (msgArgs != null) {
msg = MessageFormat.format(msg, msgArgs);
}
return msg;
}
} | /*
* Copyright Red Hat Inc. and/or its affiliates and other contributors
* as indicated by the authors tag. All rights reserved.
*
* This copyrighted material is made available to anyone wishing to use,
* modify, copy, or redistribute it subject to the terms and conditions
* of the GNU General Public License version 2.
*
* This particular file is subject to the "Classpath" exception as provided in the
* LICENSE file that accompanied this code.
*
* This program is distributed in the hope that it will be useful, but WITHOUT A
* WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS FOR A
* PARTICULAR PURPOSE. See the GNU General Public License for more details.
* You should have received a copy of the GNU General Public License,
* along with this distribution; if not, write to the Free Software
* Foundation, Inc., 51 Franklin Street, Fifth Floor, Boston,
* MA 02110-1301, USA.
*/
package com.redhat.ceylon.common.tool;
import java.util.ResourceBundle;
import com.redhat.ceylon.common.Messages;
class ToolMessages extends Messages {
private static final ResourceBundle RESOURCE_BUNDLE = ResourceBundle.getBundle("com.redhat.ceylon.common.tool.tools");
public static String msg(String msgKey, Object... msgArgs) {
return msg(RESOURCE_BUNDLE, msgKey, msgArgs);
}
} | Extend the common Messages class | Extend the common Messages class
| Java | apache-2.0 | ceylon/ceylon-common,jvasileff/ceylon-common,ceylon/ceylon-common,ceylon/ceylon-common,jvasileff/ceylon-common,jvasileff/ceylon-common | java | ## Code Before:
/*
* Copyright Red Hat Inc. and/or its affiliates and other contributors
* as indicated by the authors tag. All rights reserved.
*
* This copyrighted material is made available to anyone wishing to use,
* modify, copy, or redistribute it subject to the terms and conditions
* of the GNU General Public License version 2.
*
* This particular file is subject to the "Classpath" exception as provided in the
* LICENSE file that accompanied this code.
*
* This program is distributed in the hope that it will be useful, but WITHOUT A
* WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS FOR A
* PARTICULAR PURPOSE. See the GNU General Public License for more details.
* You should have received a copy of the GNU General Public License,
* along with this distribution; if not, write to the Free Software
* Foundation, Inc., 51 Franklin Street, Fifth Floor, Boston,
* MA 02110-1301, USA.
*/
package com.redhat.ceylon.common.tool;
import java.text.MessageFormat;
import java.util.ResourceBundle;
class ToolMessages {
private static final ResourceBundle RESOURCE_BUNDLE = ResourceBundle.getBundle("com.redhat.ceylon.common.tool.tools");
public static String msg(String msgKey, Object... msgArgs) {
String msg = RESOURCE_BUNDLE.getString(msgKey);
if (msgArgs != null) {
msg = MessageFormat.format(msg, msgArgs);
}
return msg;
}
}
## Instruction:
Extend the common Messages class
## Code After:
/*
* Copyright Red Hat Inc. and/or its affiliates and other contributors
* as indicated by the authors tag. All rights reserved.
*
* This copyrighted material is made available to anyone wishing to use,
* modify, copy, or redistribute it subject to the terms and conditions
* of the GNU General Public License version 2.
*
* This particular file is subject to the "Classpath" exception as provided in the
* LICENSE file that accompanied this code.
*
* This program is distributed in the hope that it will be useful, but WITHOUT A
* WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS FOR A
* PARTICULAR PURPOSE. See the GNU General Public License for more details.
* You should have received a copy of the GNU General Public License,
* along with this distribution; if not, write to the Free Software
* Foundation, Inc., 51 Franklin Street, Fifth Floor, Boston,
* MA 02110-1301, USA.
*/
package com.redhat.ceylon.common.tool;
import java.util.ResourceBundle;
import com.redhat.ceylon.common.Messages;
class ToolMessages extends Messages {
private static final ResourceBundle RESOURCE_BUNDLE = ResourceBundle.getBundle("com.redhat.ceylon.common.tool.tools");
public static String msg(String msgKey, Object... msgArgs) {
return msg(RESOURCE_BUNDLE, msgKey, msgArgs);
}
} |
5118c78a3a89424e0577d538b2f85b1753284344 | setup.py | setup.py | import os
import re
from setuptools import setup, find_packages
def get_version():
with open(os.path.join(os.path.dirname(__file__), 'gitlab_mr.py')) as f:
for line in f.readlines():
m = re.match(r"__version__ = '(.*?)'", line)
if m:
return m.group(1)
raise ValueError('Cannot find version')
def parse_requirements(req_file_path):
with open(req_file_path) as f:
return f.readlines()
setup(
name="gitlab-merge-request",
version=get_version(),
author="Alexander Koval",
author_email="kovalidis@gmail.com",
description=("Console utility to create gitlab merge requests."),
license="Apache License 2.0",
keywords="git gitlab merge-request",
url="https://github.com/anti-social/gitlab-merge-request",
py_modules=[
'gitlab_mr',
],
entry_points = {
'console_scripts': ['gitlab-mr = gitlab_mr:main'],
},
install_requires=parse_requirements('requirements.txt'),
classifiers=[
"Development Status :: 3 - Alpha",
"Intended Audience :: Developers",
"License :: OSI Approved :: Apache Software License",
"Operating System :: OS Independent",
"Programming Language :: Python",
"Environment :: Console",
"Topic :: Software Development :: Version Control",
"Topic :: Utilities",
],
)
| import os
import re
import sys
from setuptools import setup, find_packages
PY_VER = sys.version_info
if PY_VER >= (3, 4):
pass
else:
raise RuntimeError("Only support Python version >= 3.4")
def get_version():
with open(os.path.join(os.path.dirname(__file__), 'gitlab_mr.py')) as f:
for line in f.readlines():
m = re.match(r"__version__ = '(.*?)'", line)
if m:
return m.group(1)
raise ValueError('Cannot find version')
def parse_requirements(req_file_path):
with open(req_file_path) as f:
return f.readlines()
setup(
name="gitlab-merge-request",
version=get_version(),
author="Alexander Koval",
author_email="kovalidis@gmail.com",
description=("Console utility to create gitlab merge requests."),
license="Apache License 2.0",
keywords="git gitlab merge-request",
url="https://github.com/anti-social/gitlab-merge-request",
py_modules=[
'gitlab_mr',
],
entry_points = {
'console_scripts': ['gitlab-mr = gitlab_mr:main'],
},
install_requires=parse_requirements('requirements.txt'),
classifiers=[
"Development Status :: 3 - Alpha",
"Intended Audience :: Developers",
"License :: OSI Approved :: Apache Software License",
"Operating System :: OS Independent",
"Programming Language :: Python",
"Environment :: Console",
"Topic :: Software Development :: Version Control",
"Topic :: Utilities",
],
)
| Support only python >= 3.4 | Support only python >= 3.4
| Python | apache-2.0 | anti-social/gitlab-merge-request | python | ## Code Before:
import os
import re
from setuptools import setup, find_packages
def get_version():
with open(os.path.join(os.path.dirname(__file__), 'gitlab_mr.py')) as f:
for line in f.readlines():
m = re.match(r"__version__ = '(.*?)'", line)
if m:
return m.group(1)
raise ValueError('Cannot find version')
def parse_requirements(req_file_path):
with open(req_file_path) as f:
return f.readlines()
setup(
name="gitlab-merge-request",
version=get_version(),
author="Alexander Koval",
author_email="kovalidis@gmail.com",
description=("Console utility to create gitlab merge requests."),
license="Apache License 2.0",
keywords="git gitlab merge-request",
url="https://github.com/anti-social/gitlab-merge-request",
py_modules=[
'gitlab_mr',
],
entry_points = {
'console_scripts': ['gitlab-mr = gitlab_mr:main'],
},
install_requires=parse_requirements('requirements.txt'),
classifiers=[
"Development Status :: 3 - Alpha",
"Intended Audience :: Developers",
"License :: OSI Approved :: Apache Software License",
"Operating System :: OS Independent",
"Programming Language :: Python",
"Environment :: Console",
"Topic :: Software Development :: Version Control",
"Topic :: Utilities",
],
)
## Instruction:
Support only python >= 3.4
## Code After:
import os
import re
import sys
from setuptools import setup, find_packages
PY_VER = sys.version_info
if PY_VER >= (3, 4):
pass
else:
raise RuntimeError("Only support Python version >= 3.4")
def get_version():
with open(os.path.join(os.path.dirname(__file__), 'gitlab_mr.py')) as f:
for line in f.readlines():
m = re.match(r"__version__ = '(.*?)'", line)
if m:
return m.group(1)
raise ValueError('Cannot find version')
def parse_requirements(req_file_path):
with open(req_file_path) as f:
return f.readlines()
setup(
name="gitlab-merge-request",
version=get_version(),
author="Alexander Koval",
author_email="kovalidis@gmail.com",
description=("Console utility to create gitlab merge requests."),
license="Apache License 2.0",
keywords="git gitlab merge-request",
url="https://github.com/anti-social/gitlab-merge-request",
py_modules=[
'gitlab_mr',
],
entry_points = {
'console_scripts': ['gitlab-mr = gitlab_mr:main'],
},
install_requires=parse_requirements('requirements.txt'),
classifiers=[
"Development Status :: 3 - Alpha",
"Intended Audience :: Developers",
"License :: OSI Approved :: Apache Software License",
"Operating System :: OS Independent",
"Programming Language :: Python",
"Environment :: Console",
"Topic :: Software Development :: Version Control",
"Topic :: Utilities",
],
)
|
852feabb469e8ae6d708fbe66a815e8e86f0f36f | lib/mongo_mapper/plugins/touch.rb | lib/mongo_mapper/plugins/touch.rb | module MongoMapper
module Plugins
module Touch
extend ActiveSupport::Concern
def touch(key = :updated_at)
raise "InvalidKey" unless self.key_names.include?(key.to_s)
self.set(key => Time.now.utc)
true
end
end
end
end | module MongoMapper
module Plugins
module Touch
extend ActiveSupport::Concern
def touch(key = :updated_at)
raise ArgumentError, "Invalid key named #{key}" unless self.key_names.include?(key.to_s)
self.set(key => Time.now.utc)
true
end
end
end
end | Raise an ArgumentError with a better message | Raise an ArgumentError with a better message
| Ruby | mit | dhemmat/mongomapper,kachick/mongomapper,mongomapper/mongomapper,Yesware/mongomapper | ruby | ## Code Before:
module MongoMapper
module Plugins
module Touch
extend ActiveSupport::Concern
def touch(key = :updated_at)
raise "InvalidKey" unless self.key_names.include?(key.to_s)
self.set(key => Time.now.utc)
true
end
end
end
end
## Instruction:
Raise an ArgumentError with a better message
## Code After:
module MongoMapper
module Plugins
module Touch
extend ActiveSupport::Concern
def touch(key = :updated_at)
raise ArgumentError, "Invalid key named #{key}" unless self.key_names.include?(key.to_s)
self.set(key => Time.now.utc)
true
end
end
end
end |
65c5474936dca27023e45c1644fa2a9492e9a420 | tests/convergence_tests/run_convergence_tests_lspr.py | tests/convergence_tests/run_convergence_tests_lspr.py | import os
import time
import subprocess
import datetime
from check_for_meshes import check_mesh
# tests to run
tests = ['sphere_lspr.py', 'sphere_multiple_lspr.py']
# specify CUDA device to use
CUDA_DEVICE = '0'
ENV = os.environ.copy()
ENV['CUDA_DEVICE'] = CUDA_DEVICE
mesh_file = ''
folder_name = 'lspr_convergence_test_meshes'
rename_folder = 'geometry_lspr'
size = '~3MB'
check_mesh(mesh_file, folder_name, rename_folder, size)
tic = time.time()
for test in tests:
subprocess.call(['python', '{}'.format(test)])
toc = time.time()
print("Total runtime for convergence tests: ")
print(str(datetime.timedelta(seconds=(toc - tic))))
| import os
import time
import subprocess
import datetime
from check_for_meshes import check_mesh
# tests to run
tests = ['sphere_lspr.py', 'sphere_multiple_lspr.py']
# specify CUDA device to use
CUDA_DEVICE = '0'
ENV = os.environ.copy()
ENV['CUDA_DEVICE'] = CUDA_DEVICE
mesh_file = 'https://zenodo.org/record/580786/files/pygbe-lspr_convergence_test_meshes.zip'
folder_name = 'lspr_convergence_test_meshes'
rename_folder = 'geometry_lspr'
size = '~3MB'
check_mesh(mesh_file, folder_name, rename_folder, size)
tic = time.time()
for test in tests:
subprocess.call(['python', '{}'.format(test)])
toc = time.time()
print("Total runtime for convergence tests: ")
print(str(datetime.timedelta(seconds=(toc - tic))))
| Add path to convergence test lspr zip file | Add path to convergence test lspr zip file
| Python | bsd-3-clause | barbagroup/pygbe,barbagroup/pygbe,barbagroup/pygbe | python | ## Code Before:
import os
import time
import subprocess
import datetime
from check_for_meshes import check_mesh
# tests to run
tests = ['sphere_lspr.py', 'sphere_multiple_lspr.py']
# specify CUDA device to use
CUDA_DEVICE = '0'
ENV = os.environ.copy()
ENV['CUDA_DEVICE'] = CUDA_DEVICE
mesh_file = ''
folder_name = 'lspr_convergence_test_meshes'
rename_folder = 'geometry_lspr'
size = '~3MB'
check_mesh(mesh_file, folder_name, rename_folder, size)
tic = time.time()
for test in tests:
subprocess.call(['python', '{}'.format(test)])
toc = time.time()
print("Total runtime for convergence tests: ")
print(str(datetime.timedelta(seconds=(toc - tic))))
## Instruction:
Add path to convergence test lspr zip file
## Code After:
import os
import time
import subprocess
import datetime
from check_for_meshes import check_mesh
# tests to run
tests = ['sphere_lspr.py', 'sphere_multiple_lspr.py']
# specify CUDA device to use
CUDA_DEVICE = '0'
ENV = os.environ.copy()
ENV['CUDA_DEVICE'] = CUDA_DEVICE
mesh_file = 'https://zenodo.org/record/580786/files/pygbe-lspr_convergence_test_meshes.zip'
folder_name = 'lspr_convergence_test_meshes'
rename_folder = 'geometry_lspr'
size = '~3MB'
check_mesh(mesh_file, folder_name, rename_folder, size)
tic = time.time()
for test in tests:
subprocess.call(['python', '{}'.format(test)])
toc = time.time()
print("Total runtime for convergence tests: ")
print(str(datetime.timedelta(seconds=(toc - tic))))
|
2612e72bb416327f120a0c27168f28e31902b444 | src/GToolkit-Constraints/GtGtoolkitArchitecturalReport.class.st | src/GToolkit-Constraints/GtGtoolkitArchitecturalReport.class.st | "
!Architectural report for Glamorous Toolkit
"
Class {
#name : #GtGtoolkitArchitecturalReport,
#superclass : #GtConstrainerReport,
#category : #'GToolkit-Constraints'
}
{ #category : #building }
GtGtoolkitArchitecturalReport >> build: aComposite [
aComposite name: 'GToolkit Architectural Report'.
aComposite
addConstraint: GtClassWithCommentsContainingMissingReferences new;
addConstraint: GtReturnPragmasShouldPointToExistingClasses new;
addConstraint: GtRBAcceptVisitorCalledFromNonVisitingMethods new;
addConstraint: GtButtonsDefiningActionsThroughModels new;
addConstraint: GtLooksSubscribingToEventsNotFromInitializeListenerMethods new;
addConstraint: GtBlocEventsShouldHaveDispatchMethodSimilarToClass new;
addConstraint: GtSubscriptionsShouldNotUseDeliveryConditionWhere new;
addConstraint: GtTraitsShouldNotBeNested new;
addConstraint: GtWeakSubscriptionsWithBlockSubscribers new;
addConstraint: GtBaselinesShouldProperlySpecifyDependencies new
]
| "
Architectural report for Glamorous Toolkit.
"
Class {
#name : #GtGtoolkitArchitecturalReport,
#superclass : #GtConstrainerReport,
#category : #'GToolkit-Constraints'
}
{ #category : #accessing }
GtGtoolkitArchitecturalReport class >> yourself [
<gtExample>
^ self new
]
{ #category : #building }
GtGtoolkitArchitecturalReport >> build: aComposite [
aComposite name: 'GToolkit Architectural Report'.
aComposite
addConstraint: GtClassWithCommentsContainingMissingReferences new;
addConstraint: GtReturnPragmasShouldPointToExistingClasses new;
addConstraint: GtRBAcceptVisitorCalledFromNonVisitingMethods new;
addConstraint: GtButtonsDefiningActionsThroughModels new;
addConstraint: GtLooksSubscribingToEventsNotFromInitializeListenerMethods new;
addConstraint: GtBlocEventsShouldHaveDispatchMethodSimilarToClass new;
addConstraint: GtSubscriptionsShouldNotUseDeliveryConditionWhere new;
addConstraint: GtTraitsShouldNotBeNested new;
addConstraint: GtWeakSubscriptionsWithBlockSubscribers new;
addConstraint: GtBaselinesShouldProperlySpecifyDependencies new
]
| Add yourself example to GtGtoolkitArchitecturalReport | Add yourself example to GtGtoolkitArchitecturalReport | Smalltalk | mit | feenkcom/gtoolkit | smalltalk | ## Code Before:
"
!Architectural report for Glamorous Toolkit
"
Class {
#name : #GtGtoolkitArchitecturalReport,
#superclass : #GtConstrainerReport,
#category : #'GToolkit-Constraints'
}
{ #category : #building }
GtGtoolkitArchitecturalReport >> build: aComposite [
aComposite name: 'GToolkit Architectural Report'.
aComposite
addConstraint: GtClassWithCommentsContainingMissingReferences new;
addConstraint: GtReturnPragmasShouldPointToExistingClasses new;
addConstraint: GtRBAcceptVisitorCalledFromNonVisitingMethods new;
addConstraint: GtButtonsDefiningActionsThroughModels new;
addConstraint: GtLooksSubscribingToEventsNotFromInitializeListenerMethods new;
addConstraint: GtBlocEventsShouldHaveDispatchMethodSimilarToClass new;
addConstraint: GtSubscriptionsShouldNotUseDeliveryConditionWhere new;
addConstraint: GtTraitsShouldNotBeNested new;
addConstraint: GtWeakSubscriptionsWithBlockSubscribers new;
addConstraint: GtBaselinesShouldProperlySpecifyDependencies new
]
## Instruction:
Add yourself example to GtGtoolkitArchitecturalReport
## Code After:
"
Architectural report for Glamorous Toolkit.
"
Class {
#name : #GtGtoolkitArchitecturalReport,
#superclass : #GtConstrainerReport,
#category : #'GToolkit-Constraints'
}
{ #category : #accessing }
GtGtoolkitArchitecturalReport class >> yourself [
<gtExample>
^ self new
]
{ #category : #building }
GtGtoolkitArchitecturalReport >> build: aComposite [
aComposite name: 'GToolkit Architectural Report'.
aComposite
addConstraint: GtClassWithCommentsContainingMissingReferences new;
addConstraint: GtReturnPragmasShouldPointToExistingClasses new;
addConstraint: GtRBAcceptVisitorCalledFromNonVisitingMethods new;
addConstraint: GtButtonsDefiningActionsThroughModels new;
addConstraint: GtLooksSubscribingToEventsNotFromInitializeListenerMethods new;
addConstraint: GtBlocEventsShouldHaveDispatchMethodSimilarToClass new;
addConstraint: GtSubscriptionsShouldNotUseDeliveryConditionWhere new;
addConstraint: GtTraitsShouldNotBeNested new;
addConstraint: GtWeakSubscriptionsWithBlockSubscribers new;
addConstraint: GtBaselinesShouldProperlySpecifyDependencies new
]
|
27a7e589ec3f5b29d99cede4af66780509ab6973 | foursquare/tests/test_photos.py | foursquare/tests/test_photos.py | import logging; log = logging.getLogger(__name__)
from . import BaseAuthenticatedEndpointTestCase, BaseUserlessEndpointTestCase
import os
TEST_DATA_DIR = os.path.join(os.path.dirname(__file__), 'testdata')
class PhotosEndpointTestCase(BaseAuthenticatedEndpointTestCase):
"""
General
"""
def test_photo(self):
response = self.api.photos(self.default_photoid)
assert 'photo' in response
def test_attach_photo(self):
"""Creates a checkin and attaches a photo to it."""
response = self.api.checkins.add(params={'venueId': self.default_venueid})
checkin = response.get('checkin')
self.assertIsNotNone(checkin)
photo_data = open(os.path.join(TEST_DATA_DIR, 'test-photo.jpg'), 'rb')
try:
response = self.api.photos.add(params={'checkinId': checkin['id']},
photo_data=photo_data)
photo = response.get('photo')
self.assertIsNotNone(photo)
self.assertEquals(300, photo['width'])
self.assertEquals(300, photo['height'])
finally:
photo_data.close()
| import logging; log = logging.getLogger(__name__)
from . import BaseAuthenticatedEndpointTestCase, BaseUserlessEndpointTestCase
import os
TEST_DATA_DIR = os.path.join(os.path.dirname(__file__), 'testdata')
class PhotosEndpointTestCase(BaseAuthenticatedEndpointTestCase):
"""
General
"""
def test_photo(self):
response = self.api.photos(self.default_photoid)
assert 'photo' in response
def test_attach_photo(self):
"""Creates a checkin and attaches a photo to it."""
response = self.api.checkins.add(params={'venueId': self.default_venueid})
checkin = response.get('checkin')
self.assertNotEqual(checkin, None)
photo_data = open(os.path.join(TEST_DATA_DIR, 'test-photo.jpg'), 'rb')
try:
response = self.api.photos.add(params={'checkinId': checkin['id']},
photo_data=photo_data)
photo = response.get('photo')
self.assertNotEqual(photo, None)
self.assertEquals(300, photo['width'])
self.assertEquals(300, photo['height'])
finally:
photo_data.close()
| Make test compatible with Python 2.6. | Make test compatible with Python 2.6.
| Python | mit | mLewisLogic/foursquare,mLewisLogic/foursquare | python | ## Code Before:
import logging; log = logging.getLogger(__name__)
from . import BaseAuthenticatedEndpointTestCase, BaseUserlessEndpointTestCase
import os
TEST_DATA_DIR = os.path.join(os.path.dirname(__file__), 'testdata')
class PhotosEndpointTestCase(BaseAuthenticatedEndpointTestCase):
"""
General
"""
def test_photo(self):
response = self.api.photos(self.default_photoid)
assert 'photo' in response
def test_attach_photo(self):
"""Creates a checkin and attaches a photo to it."""
response = self.api.checkins.add(params={'venueId': self.default_venueid})
checkin = response.get('checkin')
self.assertIsNotNone(checkin)
photo_data = open(os.path.join(TEST_DATA_DIR, 'test-photo.jpg'), 'rb')
try:
response = self.api.photos.add(params={'checkinId': checkin['id']},
photo_data=photo_data)
photo = response.get('photo')
self.assertIsNotNone(photo)
self.assertEquals(300, photo['width'])
self.assertEquals(300, photo['height'])
finally:
photo_data.close()
## Instruction:
Make test compatible with Python 2.6.
## Code After:
import logging; log = logging.getLogger(__name__)
from . import BaseAuthenticatedEndpointTestCase, BaseUserlessEndpointTestCase
import os
TEST_DATA_DIR = os.path.join(os.path.dirname(__file__), 'testdata')
class PhotosEndpointTestCase(BaseAuthenticatedEndpointTestCase):
"""
General
"""
def test_photo(self):
response = self.api.photos(self.default_photoid)
assert 'photo' in response
def test_attach_photo(self):
"""Creates a checkin and attaches a photo to it."""
response = self.api.checkins.add(params={'venueId': self.default_venueid})
checkin = response.get('checkin')
self.assertNotEqual(checkin, None)
photo_data = open(os.path.join(TEST_DATA_DIR, 'test-photo.jpg'), 'rb')
try:
response = self.api.photos.add(params={'checkinId': checkin['id']},
photo_data=photo_data)
photo = response.get('photo')
self.assertNotEqual(photo, None)
self.assertEquals(300, photo['width'])
self.assertEquals(300, photo['height'])
finally:
photo_data.close()
|
9ff1c42b4b106fdf0484cd2522151bb6507353fe | allauth/templates/account/password_reset_from_key.html | allauth/templates/account/password_reset_from_key.html | {% extends "account/base.html" %}
{% load i18n %}
{% block head_title %}{% trans "Change Password" %}{% endblock %}
{% block content %}
<h1>{% if token_fail %}{% trans "Bad Token" %}{% else %}{% trans "Change Password" %}{% endif %}</h1>
{% if token_fail %}
{% url 'account_reset_password' as passwd_reset_url %}
<p>{% blocktrans %}The password reset link was invalid, possibly because it has already been used. Please request a <a href="{{ passwd_reset_url }}">new password reset</a>.{% endblocktrans %}</p>
{% else %}
{% if form %}
<form method="POST" action=".">
{% csrf_token %}
{{ form.as_p }}
<input type="submit" name="action" value="{% trans 'change password' %}"/>
</form>
{% else %}
<p>{% trans 'Your password is now changed.' %}</p>
{% endif %}
{% endif %}
{% endblock %}
| {% extends "account/base.html" %}
{% load i18n %}
{% block head_title %}{% trans "Change Password" %}{% endblock %}
{% block content %}
<h1>{% if token_fail %}{% trans "Bad Token" %}{% else %}{% trans "Change Password" %}{% endif %}</h1>
{% if token_fail %}
{% url 'account_reset_password' as passwd_reset_url %}
<p>{% blocktrans %}The password reset link was invalid, possibly because it has already been used. Please request a <a href="{{ passwd_reset_url }}">new password reset</a>.{% endblocktrans %}</p>
{% else %}
{% if form %}
<form method="POST" action="{% url 'account_reset_password_from_key' %}">
{% csrf_token %}
{{ form.as_p }}
<input type="submit" name="action" value="{% trans 'change password' %}"/>
</form>
{% else %}
<p>{% trans 'Your password is now changed.' %}</p>
{% endif %}
{% endif %}
{% endblock %}
| Put in real action URL instead of '.' | Put in real action URL instead of '.'
Closes #1437
| HTML | mit | bittner/django-allauth,pztrick/django-allauth,spool/django-allauth,lukeburden/django-allauth,rsalmaso/django-allauth,pztrick/django-allauth,AltSchool/django-allauth,lukeburden/django-allauth,pennersr/django-allauth,joshowen/django-allauth,joshowen/django-allauth,joshowen/django-allauth,AltSchool/django-allauth,rsalmaso/django-allauth,AltSchool/django-allauth,bittner/django-allauth,pennersr/django-allauth,lukeburden/django-allauth,pztrick/django-allauth,bittner/django-allauth,spool/django-allauth,spool/django-allauth,rsalmaso/django-allauth,pennersr/django-allauth | html | ## Code Before:
{% extends "account/base.html" %}
{% load i18n %}
{% block head_title %}{% trans "Change Password" %}{% endblock %}
{% block content %}
<h1>{% if token_fail %}{% trans "Bad Token" %}{% else %}{% trans "Change Password" %}{% endif %}</h1>
{% if token_fail %}
{% url 'account_reset_password' as passwd_reset_url %}
<p>{% blocktrans %}The password reset link was invalid, possibly because it has already been used. Please request a <a href="{{ passwd_reset_url }}">new password reset</a>.{% endblocktrans %}</p>
{% else %}
{% if form %}
<form method="POST" action=".">
{% csrf_token %}
{{ form.as_p }}
<input type="submit" name="action" value="{% trans 'change password' %}"/>
</form>
{% else %}
<p>{% trans 'Your password is now changed.' %}</p>
{% endif %}
{% endif %}
{% endblock %}
## Instruction:
Put in real action URL instead of '.'
Closes #1437
## Code After:
{% extends "account/base.html" %}
{% load i18n %}
{% block head_title %}{% trans "Change Password" %}{% endblock %}
{% block content %}
<h1>{% if token_fail %}{% trans "Bad Token" %}{% else %}{% trans "Change Password" %}{% endif %}</h1>
{% if token_fail %}
{% url 'account_reset_password' as passwd_reset_url %}
<p>{% blocktrans %}The password reset link was invalid, possibly because it has already been used. Please request a <a href="{{ passwd_reset_url }}">new password reset</a>.{% endblocktrans %}</p>
{% else %}
{% if form %}
<form method="POST" action="{% url 'account_reset_password_from_key' %}">
{% csrf_token %}
{{ form.as_p }}
<input type="submit" name="action" value="{% trans 'change password' %}"/>
</form>
{% else %}
<p>{% trans 'Your password is now changed.' %}</p>
{% endif %}
{% endif %}
{% endblock %}
|
58dce022ddda69f2ea347f1f5b324204a2d20975 | app/scripts/app/routes/application.js | app/scripts/app/routes/application.js | var ApplicationRoute = Em.Route.extend({
model: function () {
// bootstrap initialized data if it exists
if (Em.isArray(window.bootstrap)) {
this.store.pushMany('person', window.bootstrap);
// signal to IndexController that data exists
// and fetching from the server is not necessary
// IndexRoute will remove this property after its first render.
return Em.Object.create({
bootstrapped: true,
});
}
// signal to IndexController that data does not yet exist
// and fetching from the server is necessary.
// IndexRoute will remove this property after its first render.
return Em.Object.create({
bootstrapped: false
});
},
});
export default ApplicationRoute;
| var ApplicationRoute = Em.Route.extend();
export default ApplicationRoute;
| Remove outdated bootstrap code in ApplicationRoute | Remove outdated bootstrap code in ApplicationRoute
| JavaScript | mit | darvelo/wishlist,darvelo/wishlist | javascript | ## Code Before:
var ApplicationRoute = Em.Route.extend({
model: function () {
// bootstrap initialized data if it exists
if (Em.isArray(window.bootstrap)) {
this.store.pushMany('person', window.bootstrap);
// signal to IndexController that data exists
// and fetching from the server is not necessary
// IndexRoute will remove this property after its first render.
return Em.Object.create({
bootstrapped: true,
});
}
// signal to IndexController that data does not yet exist
// and fetching from the server is necessary.
// IndexRoute will remove this property after its first render.
return Em.Object.create({
bootstrapped: false
});
},
});
export default ApplicationRoute;
## Instruction:
Remove outdated bootstrap code in ApplicationRoute
## Code After:
var ApplicationRoute = Em.Route.extend();
export default ApplicationRoute;
|
7921c9584050268aa3028cd28fcec2b67e52ebd5 | mongo_mapper_acts_as_versioned.gemspec | mongo_mapper_acts_as_versioned.gemspec | require File.expand_path('../lib/acts_as_versioned', __FILE__)
Gem::Specification.new do |gem|
gem.name = 'mongo_mapper_acts_as_versioned'
gem.version = MongoMapper::Acts::Versioned::VERSION
gem.platform = Gem::Platform::RUBY
gem.authors = ['Gigamo']
gem.email = ['gigamo@gmail.com']
gem.homepage = 'http://github.com/gigamo/mongo_mapper_acts_as_versioned'
gem.summary = "Basic MongoMapper port of technoweenie's acts_as_versioned"
gem.description = gem.summary
gem.rubyforge_project = 'mongo_mapper_acts_as_versioned'
gem.require_paths = ['lib']
gem.files =
Dir['{lib,spec}/**/*', 'LICENSE', 'README.md'] & `git ls-files -z`.split("\0")
gem.add_dependency 'activesupport'
gem.add_development_dependency 'rspec'
gem.required_rubygems_version = '>= 1.3.6'
end
| Gem::Specification.new do |gem|
gem.name = 'mongo_mapper_acts_as_versioned'
gem.version = '0.2.0'
gem.platform = Gem::Platform::RUBY
gem.authors = ['Gigamo']
gem.email = ['gigamo@gmail.com']
gem.homepage = 'http://github.com/gigamo/mongo_mapper_acts_as_versioned'
gem.summary = "Basic MongoMapper port of technoweenie's acts_as_versioned"
gem.description = gem.summary
gem.rubyforge_project = 'mongo_mapper_acts_as_versioned'
gem.require_paths = ['lib']
gem.files =
Dir['{lib,spec}/**/*', 'LICENSE', 'README.md'] & `git ls-files -z`.split("\0")
gem.add_dependency 'activesupport'
gem.add_development_dependency 'rspec'
gem.required_rubygems_version = '>= 1.3.6'
end
| Stop requiring the library in gemspec | Stop requiring the library in gemspec | Ruby | mit | gigamo/mongo_mapper_acts_as_versioned | ruby | ## Code Before:
require File.expand_path('../lib/acts_as_versioned', __FILE__)
Gem::Specification.new do |gem|
gem.name = 'mongo_mapper_acts_as_versioned'
gem.version = MongoMapper::Acts::Versioned::VERSION
gem.platform = Gem::Platform::RUBY
gem.authors = ['Gigamo']
gem.email = ['gigamo@gmail.com']
gem.homepage = 'http://github.com/gigamo/mongo_mapper_acts_as_versioned'
gem.summary = "Basic MongoMapper port of technoweenie's acts_as_versioned"
gem.description = gem.summary
gem.rubyforge_project = 'mongo_mapper_acts_as_versioned'
gem.require_paths = ['lib']
gem.files =
Dir['{lib,spec}/**/*', 'LICENSE', 'README.md'] & `git ls-files -z`.split("\0")
gem.add_dependency 'activesupport'
gem.add_development_dependency 'rspec'
gem.required_rubygems_version = '>= 1.3.6'
end
## Instruction:
Stop requiring the library in gemspec
## Code After:
Gem::Specification.new do |gem|
gem.name = 'mongo_mapper_acts_as_versioned'
gem.version = '0.2.0'
gem.platform = Gem::Platform::RUBY
gem.authors = ['Gigamo']
gem.email = ['gigamo@gmail.com']
gem.homepage = 'http://github.com/gigamo/mongo_mapper_acts_as_versioned'
gem.summary = "Basic MongoMapper port of technoweenie's acts_as_versioned"
gem.description = gem.summary
gem.rubyforge_project = 'mongo_mapper_acts_as_versioned'
gem.require_paths = ['lib']
gem.files =
Dir['{lib,spec}/**/*', 'LICENSE', 'README.md'] & `git ls-files -z`.split("\0")
gem.add_dependency 'activesupport'
gem.add_development_dependency 'rspec'
gem.required_rubygems_version = '>= 1.3.6'
end
|
68e41c1f7ef79bfba22aad3392acef9aca4ca043 | sbt-release/README.md | sbt-release/README.md |
AutoPlugin that wraps [sbt-release](https://github.com/sbt/sbt-release) providing a custom versioning scheme.
### installation
Add the plugin to your project:
```
// In project/plugins.sbt
addSbtPlugin("org.allenai.plugins" % "allenai-sbt-release" % VERSION
```
Substitute `VERSION` with the latest version for the plugin on [bintray](https://bintray.com/allenai/sbt-plugins).
Enable the plugin for your project in `build.sbt`:
```
val myProject = project.in(file(".")).enablePlugins(AllenaiReleasePlugin)
```
|
AutoPlugin that wraps [sbt-release](https://github.com/sbt/sbt-release) providing a custom versioning scheme.
## Installation
Add the plugin to your project:
```
// In project/plugins.sbt
addSbtPlugin("org.allenai.plugins" % "allenai-sbt-release" % VERSION
```
Substitute `VERSION` with the latest version for the plugin on [bintray](https://bintray.com/allenai/sbt-plugins).
Enable the plugin for your **root** project in `build.sbt`:
```
val myProject = project.in(file(".")).enablePlugins(AllenaiReleasePlugin)
```
## Multi-project builds
If your project consists of subprojects, you must do the following:
- enable the `AllenaiReleasePlugin` for the root project
- stub out publishing settings for the root project
- enable the `AllenaiReleasePlugin` for all subprojects that you will release
- make sure the root project aggregates at least all subprojects that are to be released via the plugin
Here is an example multi-build project `build.sbt`:
```scala
lazy val root = project.in(file(".")).settings(
publish := { },
publishTo := Some("bogus" at "http://nowhere.com"),
publishLocal := { })
.enablePlugins(AllenaiReleasePlugin)
.aggregate(core, service)
// The core subproject will be released when you issue the release SBT command
lazy val core = project.in(file("core")).enablePlugins(AllenaiReleasePlugin)
// The service subproject will not be released because the AllenaiReleasePlugin is not enabled
lazy val service = project.in(file("service")).enablePlugins(WebServicePlugin)
```
| Improve multi-project instructions for AllenaiReleasePlugin | Improve multi-project instructions for AllenaiReleasePlugin | Markdown | apache-2.0 | ryanai3/sbt-plugins,markschaake/ai2-sbt-plugins,ryanai3/sbt-plugins,non/sbt-plugins,ryanai3/sbt-plugins,jkinkead/sbt-plugins,jkinkead/sbt-plugins,markschaake/ai2-sbt-plugins,jkinkead/sbt-plugins,markschaake/ai2-sbt-plugins,non/sbt-plugins,allenai/sbt-plugins | markdown | ## Code Before:
AutoPlugin that wraps [sbt-release](https://github.com/sbt/sbt-release) providing a custom versioning scheme.
### installation
Add the plugin to your project:
```
// In project/plugins.sbt
addSbtPlugin("org.allenai.plugins" % "allenai-sbt-release" % VERSION
```
Substitute `VERSION` with the latest version for the plugin on [bintray](https://bintray.com/allenai/sbt-plugins).
Enable the plugin for your project in `build.sbt`:
```
val myProject = project.in(file(".")).enablePlugins(AllenaiReleasePlugin)
```
## Instruction:
Improve multi-project instructions for AllenaiReleasePlugin
## Code After:
AutoPlugin that wraps [sbt-release](https://github.com/sbt/sbt-release) providing a custom versioning scheme.
## Installation
Add the plugin to your project:
```
// In project/plugins.sbt
addSbtPlugin("org.allenai.plugins" % "allenai-sbt-release" % VERSION
```
Substitute `VERSION` with the latest version for the plugin on [bintray](https://bintray.com/allenai/sbt-plugins).
Enable the plugin for your **root** project in `build.sbt`:
```
val myProject = project.in(file(".")).enablePlugins(AllenaiReleasePlugin)
```
## Multi-project builds
If your project consists of subprojects, you must do the following:
- enable the `AllenaiReleasePlugin` for the root project
- stub out publishing settings for the root project
- enable the `AllenaiReleasePlugin` for all subprojects that you will release
- make sure the root project aggregates at least all subprojects that are to be released via the plugin
Here is an example multi-build project `build.sbt`:
```scala
lazy val root = project.in(file(".")).settings(
publish := { },
publishTo := Some("bogus" at "http://nowhere.com"),
publishLocal := { })
.enablePlugins(AllenaiReleasePlugin)
.aggregate(core, service)
// The core subproject will be released when you issue the release SBT command
lazy val core = project.in(file("core")).enablePlugins(AllenaiReleasePlugin)
// The service subproject will not be released because the AllenaiReleasePlugin is not enabled
lazy val service = project.in(file("service")).enablePlugins(WebServicePlugin)
```
|
8c90cb089dd81861f31cd78fab9ab149f491c803 | sw-precache-config.json | sw-precache-config.json | {
"staticFileGlobs": [
"build",
"build/mensajes",
"build/noticias",
"build/historia",
"build/azimuth",
"build/manifest.json",
"build/assets/**/*",
"build/historia/*"
],
"stripPrefix": "build/",
"runtimeCaching": [
{
"urlPattern": "/\/elmalvinense/",
"handler": "networkFirst"
},
{
"urlPattern": "*",
"handler": "networkFirst"
}
],
"dontCacheBustUrlsMatching": "assets",
"importScripts": [
"assets/js/offline-google-analytics-import.js",
"assets/js/initga.js"
]
}
| {
"staticFileGlobs": [
"build/index.html",
"build/mensajes/",
"build/noticias",
"build/historia",
"build/azimuth",
"build/manifest.json",
"build/assets/**/*",
"build/historia/*"
],
"stripPrefix": "build",
"runtimeCaching": [
{
"urlPattern": "/\/elmalvinense/",
"handler": "networkFirst"
},
{
"urlPattern": "*",
"handler": "networkFirst"
}
],
"dontCacheBustUrlsMatching": "assets/**/*",
"importScripts": [
"assets/js/offline-google-analytics-import.js",
"assets/js/initga.js"
]
}
 | Improve resource caching in service worker | Improve resource caching in service worker
| JSON | mit | MinEduTDF/app-malvinas-russell | json | ## Code Before:
{
"staticFileGlobs": [
"build",
"build/mensajes",
"build/noticias",
"build/historia",
"build/azimuth",
"build/manifest.json",
"build/assets/**/*",
"build/historia/*"
],
"stripPrefix": "build/",
"runtimeCaching": [
{
"urlPattern": "/\/elmalvinense/",
"handler": "networkFirst"
},
{
"urlPattern": "*",
"handler": "networkFirst"
}
],
"dontCacheBustUrlsMatching": "assets",
"importScripts": [
"assets/js/offline-google-analytics-import.js",
"assets/js/initga.js"
]
}
## Instruction:
Improve resource caching in service worker
## Code After:
{
"staticFileGlobs": [
"build/index.html",
"build/mensajes/",
"build/noticias",
"build/historia",
"build/azimuth",
"build/manifest.json",
"build/assets/**/*",
"build/historia/*"
],
"stripPrefix": "build",
"runtimeCaching": [
{
"urlPattern": "/\/elmalvinense/",
"handler": "networkFirst"
},
{
"urlPattern": "*",
"handler": "networkFirst"
}
],
"dontCacheBustUrlsMatching": "assets/**/*",
"importScripts": [
"assets/js/offline-google-analytics-import.js",
"assets/js/initga.js"
]
}
|