binaryRL: Reinforcement Learning Tools for Two-Alternative Forced Choice Tasks
Tools for building Rescorla-Wagner models for two-alternative forced choice tasks, which are commonly employed in psychological research. Most concepts and ideas in this R package are drawn from Sutton and Barto (2018) <ISBN:9780262039246>. The package allows RL models to be defined intuitively using simple if-else statements; the three basic models built into the package follow Niv et al. (2012) <doi:10.1523/JNEUROSCI.5498-10.2012>. The approach to constructing and evaluating these computational models is informed by the guidelines proposed in Wilson & Collins (2019) <doi:10.7554/eLife.49547>. Example datasets included with the package are sourced from Mason et al. (2024) <doi:10.3758/s13423-023-02415-x>.
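As a conceptual illustration only (this is not the binaryRL API), a Rescorla-Wagner value update for a two-alternative forced choice trial can be written with the kind of plain if-else logic the description refers to. All function and variable names below (`rw_update`, `softmax_p1`, `alpha`, `tau`) are hypothetical and chosen for this sketch.

```r
# Minimal sketch, assuming a vector of two option values and a binary reward.
rw_update <- function(value, choice, reward, alpha = 0.1) {
  # prediction error for the chosen option
  pe <- reward - value[choice]
  # update only the chosen option's value (written as an explicit if-else)
  if (choice == 1) {
    value[1] <- value[1] + alpha * pe
  } else {
    value[2] <- value[2] + alpha * pe
  }
  value
}

# softmax choice rule: probability of choosing option 1,
# with inverse temperature tau
softmax_p1 <- function(value, tau = 1) {
  exp(tau * value[1]) / (exp(tau * value[1]) + exp(tau * value[2]))
}

# example: simulate a single learning trial with hypothetical reward rates
v <- c(0, 0)
p1 <- softmax_p1(v, tau = 3)
chosen <- ifelse(runif(1) < p1, 1, 2)
r <- ifelse(chosen == 1, rbinom(1, 1, 0.8), rbinom(1, 1, 0.2))
v <- rw_update(v, chosen, r, alpha = 0.3)
```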
| Version: | 0.9.7 |
| Depends: | R (≥ 4.0.0) |
| Imports: | Rcpp, compiler, future, doFuture, foreach, doRNG, progressr |
| LinkingTo: | Rcpp |
| Suggests: | stats, GenSA, GA, DEoptim, pso, mlrMBO, mlr, ParamHelpers, smoof, lhs, DiceKriging, rgenoud, cmaes, nloptr |
| Published: | 2025-08-19 |
| DOI: | 10.32614/CRAN.package.binaryRL |
| Author: | YuKi [aut, cre] |
| Maintainer: | YuKi <hmz1969a at gmail.com> |
| BugReports: | https://github.com/yuki-961004/binaryRL/issues |
| License: | GPL-3 |
| URL: | https://yuki-961004.github.io/binaryRL/ |
| NeedsCompilation: | yes |
| CRAN checks: | binaryRL results |
Linking:
Please use the canonical form https://CRAN.R-project.org/package=binaryRL to link to this page.