PrivateStorage / gbs-downloader (archived project; repository and other project resources are read-only)

Commit bdfd89d9
Authored 1 year ago by Jean-Paul Calderone
Parent: 4b26af91
Merge request: !5 "Support SDMF in the `download` API"

    fix mutable share placement in the tests

Showing 1 changed file: test/Spec.hs (6 additions, 9 deletions)
--- a/test/Spec.hs
+++ b/test/Spec.hs
@@ -285,7 +285,7 @@ tests =
         (shares, cap) <- liftIO $ Tahoe.CHK.encode key params ciphertext
         -- Distribute the shares.
-        liftIO $ placeShares cap (Binary.encode <$> shares) perServerShareCount servers 0
+        liftIO $ placeShares (storageIndex . verifier $ cap) (Binary.encode <$> shares) perServerShareCount servers 0
         let serverMap = Map.fromList $ zip (Set.toList serverIDs') servers
             lookupServer = someServers serverMap
@@ -330,7 +330,7 @@ tests =
         (shares, writeCap) <- liftIO $ SDMF.encode keypair sequenceNumber required total ciphertext
         let readCap = SDMF.writerReader writeCap
         -- Distribute the shares.
-        liftIO $ placeMutable writeCap (Binary.encode <$> shares) perServerShareCount servers 0
+        liftIO $ placeShares (SDMF.Keys.unStorageIndex . SDMF.verifierStorageIndex . SDMF.readerVerifier . SDMF.writerReader $ writeCap) (Binary.encode <$> shares) perServerShareCount servers 0
         let serverMap = Map.fromList $ zip (Set.toList serverIDs') servers
             lookupServer = someServers serverMap
@@ -371,14 +371,11 @@ tests =
     -- Exactly match the nonsense makeAnn spits out
     parseURL = T.take 2 . T.drop 5

-    placeMutable :: SDMF.Writer -> [BL.ByteString] -> [Int] -> [StorageServer] -> Int -> IO ()
-    placeMutable = undefined
-
     --- PHILOSOFY
     -- We wish that share numbers were an opaque type instead of a
     -- numeric/integral type. This is not the place to argue the point
     -- though.
-    placeShares :: Reader -> [BL.ByteString] -> [Int] -> [StorageServer] -> Int -> IO ()
+    placeShares :: B.ByteString -> [BL.ByteString] -> [Int] -> [StorageServer] -> Int -> IO ()
     -- Out of shares, done.
     placeShares _ [] _ _ _ = pure ()
     -- Out of placement info but not out of shares is a programming error.
@@ -386,14 +383,14 @@ tests =
     -- Out of servers but not out of shares is a programming error.
     placeShares _ _ _ [] _ = throwIO RanOutOfServers
     -- Having some of all three means we can make progress.
-    placeShares cap shares (n : ns) (s : ss) sharesSoFar = do
+    placeShares si shares (n : ns) (s : ss) sharesSoFar = do
         -- write the right number of shares to this server
-        zipWithM_ (\shnum share -> storageServerWrite s (storageIndex . verifier $ cap) shnum 0 share) [fromIntegral sharesSoFar ..] (BL.toStrict <$> take n shares)
+        zipWithM_ (\shnum share -> storageServerWrite s si shnum 0 share) [fromIntegral sharesSoFar ..] (BL.toStrict <$> take n shares)
         -- recurse to write the rest
-        placeShares cap (drop n shares) ns ss (sharesSoFar + n)
+        placeShares si (drop n shares) ns ss (sharesSoFar + n)

 -- Make up a distinct (but nonsense) announcement for a given storage
 -- server identifier.
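The change above replaces the stubbed `placeMutable` with a single `placeShares` helper that is keyed on a raw storage index (`B.ByteString`) rather than a CHK `Reader` capability, so both immutable (CHK) and mutable (SDMF) test shares go through the same placement path. The distribution logic itself can be sketched in isolation as follows; this is a hedged, self-contained model, not the project's actual code: `StorageServer` is replaced by a pure `Map` of hypothetical server names, and errors are returned as `Either` instead of thrown, but the recursion mirrors the patched `placeShares` (each server takes the next `n` shares, with share numbers continuing from `sharesSoFar`).

```haskell
module Main where

import qualified Data.ByteString as B
import qualified Data.Map.Strict as Map

-- Illustrative stand-ins for the test suite's types (assumptions, not the
-- project's API): a raw storage index, a share number, and a pure store
-- keyed by (server name, storage index, share number).
type StorageIndex = B.ByteString
type ShareNum = Int
type Store = Map.Map (String, StorageIndex, ShareNum) B.ByteString

-- Distribute shares across servers. Mirrors the placeShares recursion in
-- the diff: out of shares means done; out of placement counts or servers
-- while shares remain is an error.
placeShares :: StorageIndex -> [B.ByteString] -> [Int] -> [String] -> Int -> Store -> Either String Store
placeShares _ [] _ _ _ store = Right store
placeShares _ _ [] _ _ _ = Left "ran out of placement info"
placeShares _ _ _ [] _ _ = Left "ran out of servers"
placeShares si shares (n : ns) (s : ss) sharesSoFar store =
    -- Write the next n shares to this server, numbering from sharesSoFar,
    -- then recurse on the remainder.
    let writes = zip [sharesSoFar ..] (take n shares)
        store' = foldr (\(shnum, share) m -> Map.insert (s, si, shnum) share m) store writes
     in placeShares si (drop n shares) ns ss (sharesSoFar + n) store'

main :: IO ()
main = do
    let si = B.pack [1, 2, 3]
        shares = map B.singleton [10, 11, 12, 13]
    case placeShares si shares [2, 2] ["srv-a", "srv-b"] 0 Map.empty of
        Left err -> putStrLn err
        Right store -> print (Map.keys store)
```

With two servers each taking two shares, `srv-a` ends up holding share numbers 0 and 1 and `srv-b` holds 2 and 3, which is the "share numbers continue across servers" behavior the `sharesSoFar` accumulator exists to provide.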