Merge pull request #4249 from PastaPastaPasta/backport-triv-pr10

backport: 'trivial' pr10
UdjinM6 2021-07-15 04:20:41 +03:00 committed by GitHub
commit 791c7c75eb
GPG Key ID: 4AEE18F83AFDEB23
17 changed files with 148 additions and 28 deletions

View File

@@ -0,0 +1,21 @@
---
name: Good first issue
about: '(Regular devs only): Suggest a new good first issue'
title: ''
labels: good first issue
assignees: ''
---
#### Useful skills:
<!-- (For example, “C++11 std::thread”, “Qt5 GUI and async GUI design” or “basic understanding of Bitcoin mining and the Bitcoin Core RPC interface”.) -->
#### Want to work on this issue?
The purpose of the `good first issue` label is to highlight which issues are suitable for a new contributor without a deep understanding of the codebase.
You do not need to request permission to start working on this. You are encouraged to comment on the issue if you are planning to work on it. This will help other contributors monitor which issues are actively being addressed and is also an effective way to request assistance if and when you need it.
For guidance on contributing, please read [CONTRIBUTING.md](https://github.com/dashpay/dash/blob/master/CONTRIBUTING.md) before opening your pull request.

.gitignore
View File

@@ -33,6 +33,7 @@ build-aux/m4/ltversion.m4
build-aux/missing
build-aux/compile
build-aux/test-driver
+config.cache
config.log
config.status
configure

View File

@@ -103,12 +103,12 @@ At this stage one should expect comments and review from other contributors. You
can add more commits to your pull request by committing them locally and pushing
to your fork until you have satisfied all feedback.

-Note: Code review is a burdensome but important part of the development process, and as such, certain types of pull requests are rejected. In general, if the **improvements** do not warrant the **review effort** required, the PR has a high chance of being rejected. It is up to the PR author to convince the reviewers that the changes warrant the review effort, and if reviewers are "Concept NAK'ing" the PR, the author may need to present arguments and/or do research backing their suggested changes.
+Note: Code review is a burdensome but important part of the development process, and as such, certain types of pull requests are rejected. In general, if the **improvements** do not warrant the **review effort** required, the PR has a high chance of being rejected. It is up to the PR author to convince the reviewers that the changes warrant the review effort, and if reviewers are "Concept NACK'ing" the PR, the author may need to present arguments and/or do research backing their suggested changes.

-Squashing Commits
----------------------------
+### Squashing Commits

-If your pull request is accepted for merging, you may be asked by a maintainer
-to squash and or [rebase](https://git-scm.com/docs/git-rebase) your commits
+If your pull request contains fixup commits (commits that change the same line of code repeatedly) or too fine-grained
+commits, you may be asked to [squash](https://git-scm.com/docs/git-rebase#_interactive_mode) your commits
before it will be merged. The basic squashing workflow is shown below.

    git checkout your_branch_name
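As a rough sketch of how the interactive squash referenced above typically proceeds (the branch name and the commit count are hypothetical):

```bash
git checkout your_branch_name
git rebase -i HEAD~3   # mark every commit after the first as "squash" (or "fixup")
# edit the combined commit message when prompted, then update the pull request:
git push --force-with-lease origin your_branch_name
```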
@@ -135,6 +135,20 @@ the respective change set.
The length of time required for peer review is unpredictable and will vary from
pull request to pull request.

+### Rebasing Changes
+
+When a pull request conflicts with the target branch, you may be asked to rebase it on top of the current target branch.
+The `git rebase` command will take care of rebuilding your commits on top of the new base.
+
+This project aims to have a clean git history, where code changes are only made in non-merge commits. This simplifies
+auditability because merge commits can be assumed to not contain arbitrary code changes. Merge commits should be signed,
+and the resulting git tree hash must be deterministic and reproducible. The script in
+[/contrib/verify-commits](/contrib/verify-commits) checks that.
+
+After a rebase, reviewers are encouraged to sign off on the force push. This should be relatively straightforward with
+the `git range-diff` tool explained in the [productivity
+notes](/doc/productivity.md#diff-the-diffs-with-git-range-diff). To avoid needless review churn, maintainers will
+generally merge pull requests that received the most review attention first.
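As an illustration of the rebase-and-review flow described above (the remote and branch names are hypothetical):

```bash
# Rebase the pull request branch onto the current target branch
git fetch origin
git checkout your_branch_name
git rebase origin/develop   # resolve any conflicts, then: git rebase --continue
git push --force-with-lease origin your_branch_name

# Compare the series before and after the rebase (@{1} is the pre-rebase tip in the reflog)
git range-diff origin/develop your_branch_name@{1} your_branch_name
```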
Pull Request Philosophy
-----------------------
@@ -306,6 +320,31 @@ of reasons for this, some of which you can do something about:
when someone else is asking for feedback on their code, and universe balances out.

+Backporting
+-----------
+
+Security and bug fixes can be backported from `master` to release
+branches.
+If the backport is non-trivial, it may be appropriate to open an
+additional PR, to backport the change, only after the original PR
+has been merged.
+Otherwise, backports will be done in batches and
+the maintainers will use the proper `Needs backport (...)` labels
+when needed (the original author does not need to worry).
+
+A backport should contain the following metadata in the commit body:
+
+```
+Github-Pull: #<PR number>
+Rebased-From: <commit hash of the original commit>
+```
+
+Have a look at [an example backport PR](
+https://github.com/bitcoin/bitcoin/pull/16189).
+
+Also see the [backport.py script](
+https://github.com/bitcoin-core/bitcoin-maintainer-tools#backport).
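A rough sketch of how such a backport commit might be prepared (the release branch name is hypothetical and the hash/PR placeholders are left as-is; `git cherry-pick -x` appends a "(cherry picked from commit ...)" line automatically):

```bash
git checkout -b my-backport origin/<release branch>
git cherry-pick -x <commit hash of the original commit>
# amend the commit body to add the backport metadata, e.g.
#   Github-Pull: #<PR number>
#   Rebased-From: <commit hash of the original commit>
git commit --amend
```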
Release Policy
--------------

View File

@@ -1,5 +1,4 @@
-dnl require autoconf 2.60 (AS_ECHO/AS_ECHO_N)
-AC_PREREQ([2.60])
+AC_PREREQ([2.69])
define(_CLIENT_VERSION_MAJOR, 0)
define(_CLIENT_VERSION_MINOR, 17)
define(_CLIENT_VERSION_REVISION, 0)

@@ -825,6 +824,7 @@ fi
dnl this flag screws up non-darwin gcc even when the check fails. special-case it.
if test x$TARGET_OS = xdarwin; then
  AX_CHECK_LINK_FLAG([[-Wl,-dead_strip]], [LDFLAGS="$LDFLAGS -Wl,-dead_strip"])
+  AX_CHECK_LINK_FLAG([[-Wl,-dead_strip_dylibs]], [LDFLAGS="$LDFLAGS -Wl,-dead_strip_dylibs"])
fi

AC_CHECK_HEADERS([endian.h sys/endian.h byteswap.h stdio.h stdlib.h unistd.h strings.h sys/types.h sys/stat.h sys/select.h sys/prctl.h])

@@ -1423,18 +1423,16 @@ if test x$bitcoin_enable_qt != xno; then
  AC_MSG_CHECKING([whether to build GUI with support for QR codes])
  if test x$have_qrencode = xno; then
    if test x$use_qr = xyes; then
-     AC_MSG_ERROR("QR support requested but cannot be built. use --without-qrencode")
+     AC_MSG_ERROR([QR support requested but cannot be built. Use --without-qrencode])
    fi
-   AC_MSG_RESULT(no)
+   use_qr=no
  else
    if test x$use_qr != xno; then
-     AC_MSG_RESULT(yes)
      AC_DEFINE([USE_QRCODE],[1],[Define if QR support should be compiled in])
      use_qr=yes
-   else
-     AC_MSG_RESULT(no)
    fi
  fi
+ AC_MSG_RESULT([$use_qr])

  if test x$XGETTEXT = x; then
    AC_MSG_WARN("xgettext is required to update qt translations")

View File

@@ -57,7 +57,6 @@ ALLOWED_LIBRARIES = {
'libgcc_s.so.1', # GCC base support
'libc.so.6', # C library
'libpthread.so.0', # threading
-'libanl.so.1', # DNS resolve
'libm.so.6', # math library
'librt.so.1', # real-time (clock)
'libatomic.so.1',

View File

@@ -12,6 +12,11 @@ max_height=3130000
netmagic=bf0c6bbd
input=/home/example/.dashcore/blocks

+# regtest
+#netmagic=fcc1b7dc
+#genesis=000008ca1832a4baf228eb1553c03d3a2c8e02399550dd6ea8d65cec3ef23d2e
+#input=/home/example/.dashcore/regtest/blocks
+
# "output" option causes blockchain files to be written to the given location,
# with "output_file" ignored. If not used, "output_file" is used instead.
# output=/home/example/blockchain_directory

View File

@@ -16,6 +16,7 @@ import sys
import dash_hash
import datetime
import time
+import glob
from collections import namedtuple
from binascii import hexlify, unhexlify

@@ -95,6 +96,30 @@ def mkblockmap(blkindex):
        blkmap[hash] = height
    return blkmap

+# This gets the first block file ID that exists from the input block
+# file directory.
+def getFirstBlockFileId(block_dir_path):
+    # First, this sets up a pattern to search for block files, for
+    # example 'blkNNNNN.dat'.
+    blkFilePattern = os.path.join(block_dir_path, "blk[0-9][0-9][0-9][0-9][0-9].dat")
+
+    # This search is done with glob
+    blkFnList = glob.glob(blkFilePattern)
+
+    if len(blkFnList) == 0:
+        print("blocks not pruned - starting at 0")
+        return 0
+
+    # We then get the lexicographic minimum, which should be the first
+    # block file name.
+    firstBlkFilePath = min(blkFnList)
+    firstBlkFn = os.path.basename(firstBlkFilePath)
+
+    # now, the string should be ['b','l','k','N','N','N','N','N','.','d','a','t']
+    # So get the ID by choosing: 3 4 5 6 7
+    # The ID is not necessarily 0 if this is a pruned node.
+    blkId = int(firstBlkFn[3:8])
+    return blkId
+
# Block header and extent on disk
BlockExtent = namedtuple('BlockExtent', ['fn', 'offset', 'inhdr', 'blkhdr', 'size'])

@@ -104,7 +129,9 @@ class BlockDataCopier:
        self.blkindex = blkindex
        self.blkmap = blkmap

-       self.inFn = 0
+       # Get first occurring block file id - for pruned nodes this
+       # will not necessarily be 0
+       self.inFn = getFirstBlockFileId(self.settings['input'])
        self.inF = None
        self.outFn = 0
        self.outsz = 0
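To illustrate what the new `getFirstBlockFileId()` helper handles, consider a hypothetical pruned datadir (the file names below are made up):

```bash
ls ~/.dashcore/blocks/blk*.dat | sort | head -3
# blk00932.dat  blk00933.dat  blk00934.dat   <- earlier files were pruned away
# The helper takes the lexicographic minimum ("blk00932.dat"), parses characters
# 3-7 of the name, and returns 932, so BlockDataCopier starts at inFn = 932 instead of 0.
```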

View File

@@ -32,6 +32,7 @@ Developer Notes
    - [GUI](#gui)
    - [Subtrees](#subtrees)
    - [Scripted diffs](#scripted-diffs)
+        - [Suggestions and examples](#suggestions-and-examples)
    - [Release notes](#release-notes)
    - [RPC interface guidelines](#rpc-interface-guidelines)

@@ -782,7 +783,7 @@ Scripted diffs
For reformatting and refactoring commits where the changes can be easily automated using a bash script, we use
scripted-diff commits. The bash script is included in the commit message and our Travis CI job checks that
the result of the script is identical to the commit. This aids reviewers since they can verify that the script
-does exactly what it's supposed to do. It is also helpful for rebasing (since the same script can just be re-run
+does exactly what it is supposed to do. It is also helpful for rebasing (since the same script can just be re-run
on the new master commit).

To create a scripted-diff:
@@ -803,7 +804,35 @@ For development, it might be more convenient to verify all scripted-diffs in a r
test/lint/commit-script-check.sh origin/master..HEAD
```

-Commit [`bb81e173`](https://github.com/bitcoin/bitcoin/commit/bb81e173) is an example of a scripted-diff.
+### Suggestions and examples
+
+If you need to replace in multiple files, prefer `git ls-files` to `find` or globbing, and `git grep` to `grep`, to
+avoid changing files that are not under version control.
+
+For efficient replacement scripts, reduce the selection to the files that potentially need to be modified, so for
+example, instead of a blanket `git ls-files src | xargs sed -i s/apple/orange/`, use
+`git grep -l apple src | xargs sed -i s/apple/orange/`.
+
+Also, it is good to keep the selection of files as specific as possible — for example, replace only in directories where
+you expect replacements — because it reduces the risk that a rebase of your commit by re-running the script will
+introduce accidental changes.
+
+Some good examples of scripted-diff:
+
+- [scripted-diff: Rename InitInterfaces to NodeContext](https://github.com/bitcoin/bitcoin/commit/301bd41a2e6765b185bd55f4c541f9e27aeea29d)
+uses an elegant script to replace occurrences of multiple terms in all source files.
+- [scripted-diff: Remove g_connman, g_banman globals](https://github.com/bitcoin/bitcoin/commit/8922d7f6b751a3e6b3b9f6fb7961c442877fb65a)
+replaces specific terms in a list of specific source files.
+- [scripted-diff: Replace fprintf with tfm::format](https://github.com/bitcoin/bitcoin/commit/fac03ec43a15ad547161e37e53ea82482cc508f9)
+does a global replacement but excludes certain directories.
+
+To find all previous uses of scripted diffs in the repository, do:
+
+```
+git log --grep="-BEGIN VERIFY SCRIPT-"
+```
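Tying the above together, a sketch of what a complete scripted-diff commit message can look like (the subject line and the apple/orange replacement are made-up examples; the `-BEGIN VERIFY SCRIPT-` / `-END VERIFY SCRIPT-` markers are the convention the grep above searches for):

```
scripted-diff: Rename apple to orange

-BEGIN VERIFY SCRIPT-
git grep -l apple src | xargs sed -i 's/apple/orange/g'
-END VERIFY SCRIPT-
```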
Release notes
-------------

View File

@@ -41,7 +41,7 @@ threads take up 8MiB for the thread stack on a 64-bit system, and 4MiB in a
By default, since glibc `2.10`, the C library will create up to two heap arenas per core. This is known to cause excessive memory usage in some scenarios. To avoid this make a script that sets `MALLOC_ARENA_MAX` before starting dashd:

```bash
-#!/bin/bash
+#!/usr/bin/env bash
export MALLOC_ARENA_MAX=1
dashd
```

View File

@@ -186,7 +186,7 @@ Codesigner only: Commit the detached codesign payloads:
rm -rf *
tar xf signature-osx.tar.gz
tar xf signature-win.tar.gz
-git add -a
+git add -A
git commit -m "point to ${VERSION}"
git tag -s v${VERSION} HEAD
git push the current branch and new tag

View File

@@ -64,8 +64,8 @@ VIAddVersionKey ProductVersion "@PACKAGE_VERSION@"
VIAddVersionKey CompanyName "${COMPANY}"
VIAddVersionKey CompanyWebsite "${URL}"
VIAddVersionKey FileVersion "@PACKAGE_VERSION@"
-VIAddVersionKey FileDescription ""
-VIAddVersionKey LegalCopyright ""
+VIAddVersionKey FileDescription "Installer for @PACKAGE_NAME@"
+VIAddVersionKey LegalCopyright "Copyright (C) 2009-@COPYRIGHT_YEAR@ @COPYRIGHT_HOLDERS_FINAL@"
InstallDirRegKey HKCU "${REGKEY}" Path
ShowUninstDetails show

View File

@@ -11,7 +11,7 @@
that the following merkle tree algorithm has a serious flaw related to
duplicate txids, resulting in a vulnerability (CVE-2012-2459).

-The reason is that if the number of hashes in the list at a given time
+The reason is that if the number of hashes in the list at a given level
is odd, the last one is duplicated before computing the next level (which
is unusual in Merkle trees). This results in certain sequences of
transactions leading to the same merkle root. For example, these two

View File

@@ -732,7 +732,7 @@
<item>
 <widget class="QPushButton" name="buttonMinimizeFee">
  <property name="toolTip">
-   <string>collapse fee-settings</string>
+   <string>Hide transaction fee settings</string>
  </property>
  <property name="text">
   <string>Hide</string>

View File

@@ -107,6 +107,7 @@ public:
    /** Generate a random integer in the range [0..range). */
    uint64_t randrange(uint64_t range)
    {
+       assert(range);
        --range;
        int bits = CountBits(range);
        while (true) {

View File

@@ -212,20 +212,21 @@ BOOST_AUTO_TEST_CASE(is)
    p2sh << OP_HASH160 << ToByteVector(dummy) << OP_EQUAL;
    BOOST_CHECK(p2sh.IsPayToScriptHash());

-   // Not considered pay-to-script-hash if using one of the OP_PUSHDATA opcodes:
    std::vector<unsigned char> direct = {OP_HASH160, 20};
    direct.insert(direct.end(), 20, 0);
    direct.push_back(OP_EQUAL);
    BOOST_CHECK(CScript(direct.begin(), direct.end()).IsPayToScriptHash());

+   // Not considered pay-to-script-hash if using one of the OP_PUSHDATA opcodes:
    std::vector<unsigned char> pushdata1 = {OP_HASH160, OP_PUSHDATA1, 20};
    pushdata1.insert(pushdata1.end(), 20, 0);
    pushdata1.push_back(OP_EQUAL);
    BOOST_CHECK(!CScript(pushdata1.begin(), pushdata1.end()).IsPayToScriptHash());

-   std::vector<unsigned char> pushdata2 = {OP_HASH160, 20, 0};
+   std::vector<unsigned char> pushdata2 = {OP_HASH160, OP_PUSHDATA2, 20, 0};
    pushdata2.insert(pushdata2.end(), 20, 0);
    pushdata2.push_back(OP_EQUAL);
    BOOST_CHECK(!CScript(pushdata2.begin(), pushdata2.end()).IsPayToScriptHash());

-   std::vector<unsigned char> pushdata4 = {OP_HASH160, 20, 0, 0, 0};
+   std::vector<unsigned char> pushdata4 = {OP_HASH160, OP_PUSHDATA4, 20, 0, 0, 0};
    pushdata4.insert(pushdata4.end(), 20, 0);
    pushdata4.push_back(OP_EQUAL);
    BOOST_CHECK(!CScript(pushdata4.begin(), pushdata4.end()).IsPayToScriptHash());

View File

@@ -54,8 +54,6 @@ struct LockPoints
    LockPoints() : height(0), time(0), maxInputBlock(nullptr) { }
};

-class CTxMemPool;
-
/** \class CTxMemPoolEntry
 *
 * CTxMemPoolEntry stores data about the corresponding transaction, as well

View File

@@ -4,7 +4,8 @@
# file COPYING or http://www.opensource.org/licenses/mit-license.php.
export LC_ALL=C

-DIR="$1"
+# Strip trailing / from directory path (in case it was added by autocomplete)
+DIR="${1%/}"
COMMIT="$2"
if [ -z "$COMMIT" ]; then
    COMMIT=HEAD
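For reference, a quick sketch of the `${1%/}` expansion introduced above (the directory values are hypothetical):

```bash
#!/usr/bin/env bash
# ${var%/} removes the shortest trailing match of "/", i.e. a single trailing slash.
dir="contrib/devtools/"
echo "${dir%/}"   # prints: contrib/devtools
dir="contrib/devtools"
echo "${dir%/}"   # prints: contrib/devtools (unchanged when there is no trailing slash)
```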