Julian Calvo, Ed.D., MBA, M.S.
Five primary research areas with annotated bibliographies. The citations here are the foundational literature behind DecipherU's assessment design, learning-path engineering, and cybersecurity career intelligence. Every citation is a peer-reviewed source or a canonical monograph used in my doctoral coursework at the University of Miami (Ed.D. Learning Sciences, defended March 2026) and master's work at Barry University (M.S. Organizational Learning and Leadership, 2017).
Last updated 2026-04-24. Citation style: APA 7th edition.
Area 01
The doctoral capstone that closed out my Ed.D. coursework at the University of Miami (defended March 3, 2026) asked whether explicit instruction and a mapping framework I developed called START could make invisible cognitive processes visible to art students working on STEM-adjacent creative tasks. The study used a mixed-methods pre/post design with n=16 and found that explicit mapping of domain vocabulary increased students' ability to name the STEM processes they were already performing implicitly. The implication for cybersecurity career development is direct: adult learners crossing from non-technical backgrounds into cybersecurity routinely perform reasoning the field values (pattern recognition, threat modeling, narrative construction in incident write-ups) before they know the field calls those activities by specific names. Naming the work is often the barrier, not doing it.
Why it matters for DecipherU
DecipherU readiness assessments are built on the same insight. Adult learners can articulate SOC-analyst-grade reasoning before they acquire SOC-analyst vocabulary, and assessment design should surface the reasoning rather than gate access behind terminology.
Yakman, G. (2008). STEAM education: An overview of creating a model of integrative education. Pupils' Attitudes Towards Technology (PATT), 19, 335-358.
Yakman's original STEAM framework, which the capstone builds on. The paper argues that the isolation of disciplines in traditional schooling obscures the common cognitive moves that span them. That claim motivated the capstone's pre/post design.
Ritchhart, R., Church, M., & Morrison, K. (2011). Making thinking visible: How to promote engagement, understanding, and independence for all learners. Jossey-Bass.
The making-thinking-visible literature directly informed the capstone's instructional design. Students produced visible artifacts of reasoning at both pre and post, and the artifacts themselves were the primary data for qualitative coding.
Bequette, J. W., & Bequette, M. B. (2012). A place for art and design education in the STEM conversation. Art Education, 65(2), 40-47. https://doi.org/10.1080/00043125.2012.11519167
Bequette and Bequette make the case that arts integration is not decorative to STEM learning but structural. The paper provided theoretical grounding for the capstone's claim that arts students already perform STEM-adjacent cognition.
Henriksen, D. (2014). Full STEAM ahead: Creativity in excellent STEM teaching practices. The STEAM Journal, 1(2), Article 15. https://doi.org/10.5642/steam.20140102.15
Henriksen documents creative practice among award-winning STEM teachers, which parallels the cybersecurity instructor-design pattern at DecipherU. Excellent cybersecurity instruction tends to share the qualities Henriksen identifies: domain humility, iterative design, and comfort with ambiguity.
Area 02
Constructivism (Piaget, Vygotsky) holds that learners build knowledge through active engagement with problems, not by receiving transmitted facts. Constructionism (Papert) narrows the claim: the most durable learning happens when learners build external artifacts that others can see and critique. Cybersecurity education is almost perfectly suited to constructionist design because every SIEM query, every home lab, every detection rule, and every write-up is an artifact that exposes reasoning. My doctoral coursework at the University of Miami (TAL 704 Intro to Learning Sciences with Dr. Maria Kolovou, TAL 706 Design for Formal Learning with Dr. So Mi Kim, TAL 708 Design for Informal Learning with Dr. David Weiss, TAL 705 Design for Online Learning with Dr. Luke Hobson) situated these pedagogies in technology-enhanced contexts that map directly onto cybersecurity learning.
Why it matters for DecipherU
The DecipherU readiness assessments are constructionist instruments by design. Candidates produce written artifacts that expose reasoning, and the scoring rubric evaluates the artifact, not an instructor's interpretation of the candidate's internal state.
Papert, S. (1980). Mindstorms: Children, computers, and powerful ideas. Basic Books.
Papert's founding text for constructionism. His central argument, that learners understand what they build, underlies every cybersecurity lab environment that asks the learner to write code, query a SIEM, or publish a write-up rather than recite a fact.
Vygotsky, L. S. (1978). Mind in society: The development of higher psychological processes. Harvard University Press.
The source for the zone of proximal development concept central to the DecipherU Adult Learning Framework for Career Transition. Chapter 6 on the interaction between learning and development specifies why scaffolded practice works when independent practice and lecture both fail.
Jonassen, D. H. (1999). Designing constructivist learning environments. In C. M. Reigeluth (Ed.), Instructional-design theories and models (Vol. 2, pp. 215-239). Lawrence Erlbaum.
Jonassen's chapter gives a practical design template for constructivist environments that became a reference point in TAL 706. The template names the components (problem, cases, resources, scaffolding, social supports) a designer must explicitly build rather than hope emerge.
Kafai, Y. B., & Resnick, M. (Eds.). (1996). Constructionism in practice: Designing, thinking, and learning in a digital world. Lawrence Erlbaum.
Edited volume that gathers constructionist classroom studies. The contributions on design-based research informed how the DecipherU content pipeline treats each published page as an artifact whose iteration can be studied rather than a static deliverable.
Area 03
The Master of Professional Studies in Applied AI at Northeastern University (currently in progress) sits at the intersection of my instructional-design training and the question I have been asking since Barry: how does knowledge get inside a learner and change behavior? Large language models change the economics of individualized instruction, but not the principles. The research question I care about is narrow: what does it take for an AI-driven instructional system to respect the adult learning assumptions from Knowles (1970), the zone of proximal development from Vygotsky (1978), and the self-efficacy sources from Bandura (1977)? Many current AI-tutoring systems ignore those foundations and produce tooling that measurably disengages adult learners.
Why it matters for DecipherU
The DecipherU AI Career Coach at /coach is built on retrieval-augmented generation over the platform's knowledge graph, and its responses are explicitly bounded by the coursework-validated learning principles that generic chatbots ignore. This is why the Coach refuses to produce career advice in crisis situations and routes the user to hotline resources. The Coach is an instructional system, not a generic assistant.
Koedinger, K. R., Corbett, A. T., & Perfetti, C. (2012). The Knowledge-Learning-Instruction framework: Bridging the science-practice chasm to enhance robust student learning. Cognitive Science, 36(5), 757-798. https://doi.org/10.1111/j.1551-6709.2012.01245.x
Koedinger and colleagues articulate the knowledge components that AI-driven instructional systems must explicitly model. The framework is why DecipherU's readiness assessments are structured around named dimensions and keyword banks instead of treating free text as an opaque blob.
Holmes, W., Bialik, M., & Fadel, C. (2019). Artificial intelligence in education: Promises and implications for teaching and learning. Center for Curriculum Redesign.
The book gives a practitioner-ready taxonomy of AI-in-education applications and a sober critique of the personalized-learning marketing layer. It was required reading in my AI-adjacent Ed.D. coursework and informs DecipherU's skepticism toward adaptive-learning claims that do not specify the underlying knowledge components.
VanLehn, K. (2011). The relative effectiveness of human tutoring, intelligent tutoring systems, and other tutoring systems. Educational Psychologist, 46(4), 197-221. https://doi.org/10.1080/00461520.2011.611369
VanLehn's meta-analysis quantifies the effect sizes of various tutoring configurations. His finding that intelligent tutoring systems approach human-tutor effectiveness when designed around knowledge components, while generic tutoring systems do not, is the evidence base for why DecipherU invests in structured assessment rubrics rather than open-ended chatbot interaction.
Chi, M. T. H., Siler, S. A., Jeong, H., Yamauchi, T., & Hausmann, R. G. (2001). Learning from human tutoring. Cognitive Science, 25(4), 471-533. https://doi.org/10.1207/s15516709cog2504_1
Chi and colleagues decomposed what expert tutors do turn-by-turn. The paper is foundational for understanding why an AI coach that does not ask for reasoning before answering a question provides weaker learning gains than one that does.
Area 04
My doctoral capstone used a pre/post mixed-methods design with qualitative coding of student artifacts alongside quantitative change scores. The design choice was deliberate. Educational outcomes are underdetermined by either qualitative or quantitative data alone, and triangulation between them produces inferences that survive scrutiny better than either approach in isolation. The research-methods coursework that grounded this work spanned Barry University (ADM 535 Research Methodologies) and University of Miami (TAL 710 Intro to Research with Dr. William Carpenter and TAL 714 Intro to Qualitative Research with Dr. Matthew Deroo).
Why it matters for DecipherU
DecipherU's commitment to citing primary sources, declaring sample sizes on salary data, and naming the survey year for every figure traces directly to mixed-methods training. The platform does not aggregate data it cannot source, and editorial decisions that would compromise traceability are rejected in the content pipeline.
Creswell, J. W., & Plano Clark, V. L. (2018). Designing and conducting mixed methods research (3rd ed.). SAGE Publications.
The canonical practitioner reference for mixed-methods design. Its taxonomy of convergent, explanatory sequential, exploratory sequential, and embedded designs mapped directly onto the capstone's convergent parallel structure.
Teddlie, C., & Tashakkori, A. (2009). Foundations of mixed methods research: Integrating quantitative and qualitative approaches in the social and behavioral sciences. SAGE Publications.
Teddlie and Tashakkori articulate the methodological defense of mixed-methods claims against paradigm purists. Their chapter on inference quality in mixed-methods work was the reference point I cited repeatedly in capstone committee meetings.
Johnson, R. B., & Onwuegbuzie, A. J. (2004). Mixed methods research: A research paradigm whose time has come. Educational Researcher, 33(7), 14-26. https://doi.org/10.3102/0013189X033007014
Johnson and Onwuegbuzie's argument that mixed methods constitute a distinct paradigm, not just a pragmatic combination, underlies why DecipherU reports both demographic-level aggregates and individual-level artifact evidence on the same page. Either in isolation would be weaker.
Miles, M. B., Huberman, A. M., & Saldaña, J. (2020). Qualitative data analysis: A methods sourcebook (4th ed.). SAGE Publications.
The methods sourcebook used to structure the capstone's coding schemes. Its chapter on matrix displays shaped how DecipherU visualizes assessment-dimension results on the readiness-assessment result pages.
Area 05
Sales training in industry is primarily apprentice-driven and treats methodologies like MEDDIC (Metrics, Economic Buyer, Decision Criteria, Decision Process, Identify Pain, Champion), Challenger Selling (Dixon and Adamson), and Value Selling as proprietary frameworks taught outside academic institutions. My professional practice at LeadSimple (145% of quota; BDR coaching curricula that increased pipeline output by 40%) combined with my instructional-design training surfaced a research question worth pursuing: what happens when industry methodologies are treated as academic curriculum, subject to the same rigor as any other professional preparation? The gap between what industry teaches and what academic sales programs teach is an underexplored research area with practical implications for workforce-development outcomes.
Why it matters for DecipherU
DecipherU's Cybersecurity Sales Mastery course (22 modules) is the applied artifact of this research direction. Unlike bootcamp sales training, the course cites peer-reviewed sources (Kahneman, Gouldner, Bandura, Dweck) alongside the methodology and situates the tactics inside seven philosophical pillars. That structure is not marketing. It is how industry knowledge becomes transferable when the conditions of a job change faster than the industry can retrain.
Dixon, M., & Adamson, B. (2011). The Challenger sale: Taking control of the customer conversation. Portfolio/Penguin.
Dixon and Adamson's empirical study of B2B sales-rep performance identifies the Challenger profile as the highest-performing archetype in complex sales. The research basis (survey of 6,000+ sales reps) makes it a legitimate academic citation despite its industry origin.
Dweck, C. S., & Leggett, E. L. (1988). A social-cognitive approach to motivation and personality. Psychological Review, 95(2), 256-273. https://doi.org/10.1037/0033-295X.95.2.256
The primary-source Dweck paper that grounds the fixed-versus-growth mindset claims repeated across sales training. Citing Dweck and Leggett (1988) anchors the work in the peer-reviewed literature rather than in the trade-press version of the idea.
Kahneman, D. (2011). Thinking, fast and slow. Farrar, Straus and Giroux.
Kahneman's treatment of System 1 and System 2 thinking, combined with his earlier prospect-theory work (Kahneman and Tversky, 1979), is the empirical foundation for much of modern B2B sales practice. Sales curricula that reference cognitive biases without citing Kahneman cannot be traced back to the primary evidence.
Gouldner, A. W. (1960). The norm of reciprocity: A preliminary statement. American Sociological Review, 25(2), 161-178. https://doi.org/10.2307/2092623
Gouldner's foundational sociology paper on reciprocity underlies consultative selling practice. Sales training that invokes value-first engagement without referencing the reciprocity-norm literature is operating on borrowed authority. The primary source is Gouldner.
Knowles, M. S., Holton, E. F., & Swanson, R. A. (2015). The adult learner: The definitive classic in adult education and human resource development (8th ed.). Routledge. https://doi.org/10.4324/9781315816951
Knowles' andragogical model is the missing link between sales methodology (industry) and sales pedagogy (academic). Adult sales professionals are adult learners first, and the training design that works respects the six andragogical assumptions Knowles articulated.