A little old lady once challenged a well-known scientist’s explanation of the structure of the universe, countering that the world is really a flat plate supported on the back of a giant tortoise. The scientist rebutted the little old lady’s challenge with one of his own, asking what the tortoise was standing on. The little old lady’s sly reply: it’s “turtles all the way down.” So too is software architecture “turtles all the way down”.
In this session, we cover a broad range of topics that include challenging traditional practices of software architecture, examining what it takes to bring down the ivory tower, probing the paradoxical aspects of architecture’s goal, and investigating the inextricable link between temporal decisions and structural flexibility. From the highest-level applications and services to the code that exists in the bowels of the system, and everything in between, we explore how an effective software architecture must be turtles all the way down. In the end, we will all have gained deep insight into the value of agile architecture.
Modularity is coming to the Java platform! Project Jigsaw will bring a standard module system to a future Java release, and OSGi is here today. Don’t wait to start designing modular software. Contrary to popular belief, you don't need a framework or a new runtime to start building modular software applications. You can start today. Learn how!
In this session, we'll examine what it means to develop modular software on the Java platform. We'll examine the goals and benefits of modular software, and explore the patterns of modular architecture that help us develop modular software systems. With just a few easy steps, we'll see how to transform our software from a huge monolith to an extensible system of collaborating software modules. By examining an existing software system, we'll see first hand how we can increase software modularity with minimal disruption. You'll walk away not just with a much deeper understanding of the benefits of modular software, but also a migration roadmap for refactoring existing applications to increase their modularity. In other words, you'll see how to get ready today for the application platform of tomorrow.
OSGi was once heralded as a contender for most important technology of the decade. Today, most developers have heard of OSGi, but few are using it to develop their enterprise software applications. Is OSGi failing? Who is using it? And what exactly are its benefits?
In this session, we'll explain the benefits of OSGi, and show that it's not just for the middleware vendor. We'll learn how you can use OSGi without making significant changes to how you write your software applications. We'll explore the OSGi ecosystem, including platforms that support OSGi. Through code examination, we'll see how the Spring framework allows us to leverage OSGi in a non-invasive way. We'll discover how OSGi encourages Polyglot programming on the Java platform. And we'll take a brief glimpse into the future of modularity on the Java platform. You'll walk away with a much better understanding of OSGi, its strengths and benefits, how to use it effectively, as well as the myths surrounding its use.
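For a sense of what OSGi asks of your code, here is a minimal, illustrative sketch of a bundle activator that publishes a service into the OSGi service registry. The GreetingService contract and its implementation are invented for this example, and a real bundle would also declare its imported and exported packages in its MANIFEST.MF.

```java
import org.osgi.framework.BundleActivator;
import org.osgi.framework.BundleContext;
import org.osgi.framework.ServiceRegistration;

// A hypothetical service contract and implementation, defined here only to
// keep the sketch self-contained.
interface GreetingService {
    String greet(String name);
}

class DefaultGreetingService implements GreetingService {
    public String greet(String name) {
        return "Hello, " + name;
    }
}

// The bundle activator: registers the service when the bundle starts and
// unregisters it when the bundle stops, so other bundles can consume the
// service without depending on the implementation class.
public class Activator implements BundleActivator {

    private ServiceRegistration<GreetingService> registration;

    @Override
    public void start(BundleContext context) {
        registration = context.registerService(
                GreetingService.class, new DefaultGreetingService(), null);
    }

    @Override
    public void stop(BundleContext context) {
        registration.unregister();
    }
}
```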
Mobile devices are now outselling PCs and Android mobile devices are leading the way. Learn how to develop for this popular and important platform. This session will provide the information you need to configure the Android development platform and begin writing native applications.
No Android experience necessary. If you already know Java and Eclipse, that would be helpful; but if not, you'll still get a good understanding of the development process for Android (and you can learn Java later if you want to continue building Android apps).
This presentation will cover the following topics:
Mobile devices are fast becoming one of the preferred platforms for enterprise development, and Android is the fastest-selling mobile platform. Prepare yourself for developing powerful business apps on Android by attending this presentation. We'll explore many of the most useful features of the Android architecture. You'll learn how to use databases, AsyncTask processes and specialized views like ListView. You'll be ready to start building the new generation of mobile applications.
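As a small taste of one of those topics, here is an illustrative AsyncTask sketch (not from the session materials); the customer-count query is a placeholder for whatever database or network work your app would actually perform.

```java
import android.os.AsyncTask;
import android.widget.TextView;

// Illustrative sketch: fetch a value off the main thread and publish the
// result back to the UI once the background work completes.
public class CustomerCountTask extends AsyncTask<Void, Void, Integer> {

    private final TextView statusView;

    public CustomerCountTask(TextView statusView) {
        this.statusView = statusView;
    }

    @Override
    protected Integer doInBackground(Void... params) {
        // Runs on a background thread; a safe place for slow I/O.
        return fetchCustomerCount();
    }

    @Override
    protected void onPostExecute(Integer count) {
        // Runs back on the UI thread once doInBackground() returns.
        statusView.setText("Customers: " + count);
    }

    private int fetchCustomerCount() {
        return 42; // placeholder for a real database or network query
    }
}
```

Kicking it off is a one-liner from an Activity: `new CustomerCountTask(statusView).execute();`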
This presentation is intended for someone who has already built an Android app and has been introduced to the development environment. We will cover some additional topics that are important for developing more sophisticated apps.
The following topics are included in this presentation:
Creating the right development environment should be a primary concern of any team embarking on native Android app development. Constructing a comprehensive build process that integrates unit testing should be one of that team's first goals.
This session will show you how to leverage a number of mature and powerful tools to make creating your Android development process more effective.
You'll see how to use tools provided with the Android SDK to do logging, unit testing and UI testing. We'll cover some third-party tools and services like Robotium for more readable unit tests. You'll see how Spoon can be used to produce HTML output and screenshots. Finally, we'll tie it all together with Gradle.
This presentation is ideal for anyone about to join a team that needs to do Android development the right way.
The following topics are included in this presentation:
There is a good amount of excitement about the new version of Java. The big evolution, of course, is lambda expressions. In this presentation we will dive into the language features in Java 8, take a look at some of their nuances, and look at ways to put them to good use.
Java 8 language capabilities and their application.
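As a quick preview, here is a small, self-contained sketch of the kind of code the session explores: a lambda expression, a method reference, and the Stream API working together.

```java
import java.util.Arrays;
import java.util.List;
import java.util.stream.Collectors;

// A small taste of Java 8: a lambda expression as a predicate, a method
// reference as a mapping function, and the Stream API to tie them together.
public class LambdaTaste {
    public static void main(String[] args) {
        List<String> languages = Arrays.asList("Java", "Groovy", "Scala", "Clojure");

        List<String> shortNames = languages.stream()
                .filter(name -> name.length() <= 5)   // lambda expression
                .map(String::toUpperCase)             // method reference
                .collect(Collectors.toList());

        shortNames.forEach(System.out::println);      // JAVA, SCALA
    }
}
```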
Programming concurrency has turned into a herculean task. I call the traditional approach the “synchronized and suffer” model. Fortunately, there are other approaches to concurrency, and you can reach out to them directly from your Java code.
In this presentation we will discuss actor-based concurrency and software transactional memory. We will then develop examples using Akka and compare the power of these approaches against the traditional approach.
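To give a flavor of the actor model, here is a minimal sketch using Akka's classic Java actor API (UntypedActor). The counter protocol is invented for the example; the session's own demos may look different.

```java
import akka.actor.ActorRef;
import akka.actor.ActorSystem;
import akka.actor.Props;
import akka.actor.UntypedActor;

// Minimal actor sketch: the counter's state is confined to the actor and is
// only touched while handling messages, so no locks or synchronized blocks
// are needed.
public class CounterApp {

    public static class Counter extends UntypedActor {
        private int count;

        @Override
        public void onReceive(Object message) {
            if ("increment".equals(message)) {
                count++;
            } else if ("report".equals(message)) {
                System.out.println("Count is " + count);
            } else {
                unhandled(message);
            }
        }
    }

    public static void main(String[] args) throws InterruptedException {
        ActorSystem system = ActorSystem.create("demo");
        ActorRef counter = system.actorOf(Props.create(Counter.class), "counter");

        // Messages are queued in the actor's mailbox and processed one at a time.
        for (int i = 0; i < 5; i++) {
            counter.tell("increment", ActorRef.noSender());
        }
        counter.tell("report", ActorRef.noSender());

        Thread.sleep(1000); // crude pause so the mailbox drains before shutdown
        system.shutdown();
    }
}
```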
Relational databases have ruled the world since the dawn of time (or so it appears). They power our enterprises, and for many in the corporate world it may be hard to imagine life without them. Each decade a novel idea would challenge the status quo and make a case to deviate from tradition. A flock of enthusiastic programmers, like your humble speaker back in the early 90s, would throw their support behind it, only to be crushed eventually by the large vendors and enterprise standards. But the excitement around NoSQL has shown that enterprise data is not the only thing that's persistent.
In this presentation we'll learn how NoSQL deviates from those deep-rooted traditional approaches, and how this may be useful. We will also discuss the situations where these types of databases may be more appropriate.
After almost a decade and several significant releases, Spring has gone a long way from challenging the then-current Java standards to becoming the de facto enterprise standard itself. Although the Spring programming model continues to evolve, it still maintains backward compatibility with many of its earlier features and paradigms. Consequently, there's often more than one way to do anything in Spring. How do you know which way is the right way?
In this session, we'll explore several ways that Spring has changed over the years and look at the best approaches when working with the latest versions of Spring.
In this presentation, we'll see how to use Spring to create, secure, streamline, hyperlink, and consume REST APIs.
In modern applications, there is a diverse array of clients consuming content from the web. Each of these clients has unique capabilities and limitations, demanding that the presentation of the application be tailored to each device. As a result, presentation logic is often pushed into the client itself, leaving the application to serve a common, data-oriented, lightweight API to be consumed by each client.
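To make that concrete, here is an illustrative sketch of such a data-oriented API built with Spring MVC's @RestController (Spring 4+). The Session type and the in-memory storage are placeholders invented for the example, not part of the session materials.

```java
import java.util.Collection;
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;
import java.util.concurrent.atomic.AtomicLong;

import org.springframework.http.HttpStatus;
import org.springframework.web.bind.annotation.PathVariable;
import org.springframework.web.bind.annotation.RequestBody;
import org.springframework.web.bind.annotation.RequestMapping;
import org.springframework.web.bind.annotation.RequestMethod;
import org.springframework.web.bind.annotation.ResponseStatus;
import org.springframework.web.bind.annotation.RestController;

// Illustrative sketch of a lightweight REST endpoint that any client can consume.
@RestController
@RequestMapping("/sessions")
public class SessionController {

    private final Map<Long, Session> sessions = new ConcurrentHashMap<>();
    private final AtomicLong ids = new AtomicLong();

    // GET /sessions -- returned objects are rendered as JSON by Spring's
    // message converters.
    @RequestMapping(method = RequestMethod.GET)
    public Collection<Session> list() {
        return sessions.values();
    }

    // GET /sessions/{id}
    @RequestMapping(value = "/{id}", method = RequestMethod.GET)
    public Session find(@PathVariable Long id) {
        return sessions.get(id);
    }

    // POST /sessions -- the request body is bound from JSON.
    @RequestMapping(method = RequestMethod.POST)
    @ResponseStatus(HttpStatus.CREATED)
    public Session create(@RequestBody Session session) {
        long id = ids.incrementAndGet();
        session.setId(id);
        sessions.put(id, session);
        return session;
    }

    public static class Session {
        private Long id;
        private String title;

        public Long getId() { return id; }
        public void setId(Long id) { this.id = id; }
        public String getTitle() { return title; }
        public void setTitle(String title) { this.title = title; }
    }
}
```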
So you think you've picked up enough JavaScript to be dangerous, but feel like the whole prototypal language thing is still a mystery. In this session, we'll go from basic JavaScript to advanced JavaScript. We'll discuss and code modular JavaScript with CommonJS. We'll look into the details of a prototype language and discuss things like parasitic inheritance. We'll also look at JavaScript libraries that will help you get the most out of JavaScript - not jQuery, but libraries like UnderscoreJS and SugarJS.
This is a fast-paced session meant to bring you up to speed with the latest and greatest JavaScript techniques and tools. Whether you're building client-side JavaScript with HTML5 or Appcelerator Titanium, or server-side JavaScript with node.js, you'll come away with knowledge and patterns for how the pros use JavaScript to build real apps.
We've come a long way down the JavaScript road. Gone are the days of 'just hack it' for the web - architecting even a small project in JavaScript can be a challenge. Thankfully, there are several frameworks to help you; the most popular currently is Backbone.js.
Before you start using a framework in JavaScript, you will want to understand the techniques expert JavaScript programmers use to build them. In this session, we'll dive into design patterns in JavaScript, and do live coding so you can see these patterns applied. Even if you're not using a framework, you can use these design patterns to make your code more maintainable, elegant, and concise.
JavaScript has a mixed heritage: OO and functional. To date, we developers have focused on the OO side of JavaScript, and not much mind-share has been given to the other, more powerful side. In this session we'll explore how to use the power of functional JavaScript.
JavaScript has elements of two distinct programming languages: Self and Scheme. These two languages are very different - and some of JavaScript's weirdness is due to this mixing of very different language designs. The conceptual models are also very different between Self and Scheme - one is a prototypical object based language, while the other is a functional language. In this session, we'll discuss the elements of how the Scheme functional programming language manifest in JavaScript. We're going to explore how you can write JavaScript in a more elegant and powerful way by applying functional concepts.
No matter where you slice software engineering:
The root cause of many, if not most problems, is the common absence of critical thinking in how we approach decision making. Instead of thinking critically about our engineering decisions, we often follow a Cargo Cult mentality or blindly follow the pronouncements of the Blowhard Jamboree. The end results all too often include suboptimal productivity, excessive spending, poor quality and cancelled projects.
When we instead think critically about a component of software engineering, we take it apart. We discard our presuppositions. We challenge tradition. We gather our own evidence. We question everything.
This talk will examine the pathologies associated with not thinking critically, including a tour of the antipatterns that can emerge from such a practice. We'll then walk through the concentric circles of the critical thinking process, including evidence evaluation, argument evaluation, and argument construction. You'll leave this session with a critical thinking framework which can be applied to software engineering as well as beyond.
Now that you've completed the “Critical Thinking in Software Engineering” lecture, it's time to put your new skills to work. In this session, we'll break up into teams. Each team will be presented with either an argument to evaluate or a problem situation for which you must construct an argument advocating a particular course of action.
Teams will then be responsible for presenting their evaluations and arguments to the group, and the group as a whole will evaluate the presentations.
Topics to include:
In modern applications, JavaScript is increasingly prevalent on the client side and, to some degree, on the server side. As we continue to crank out more JavaScript code, we're finding that many of the same hard lessons we learned in writing decoupled Java code apply equally to JavaScript code. Without the benefit of dependency injection and AOP, both Java and JavaScript code can quickly become an unnavigable and untestable mess.
Where frameworks like Spring have helped us gain control over our Java code, Cujo.js similarly aims to give our JavaScript code more structure and testability.
In this session, we'll look at Cujo.js, an “unframework” that provides dependency injection tailored to JavaScript's unique needs to create loosely coupled code. We'll also see how, although Cujo.js isn't strictly a UI framework, elements of Cujo.js can be brought together to elegantly build client-side UIs.
Ever wish you could use your JavaScript-foo to build a NATIVE mobile app? Wish there was an open-source platform that would let you build awesome cross-platform mobile apps? Come to this session and learn about Titanium, an open-source, JavaScript-based platform for creating native mobile apps.
Titanium is an open-source development tool from Appcelerator for producing cross-platform mobile applications. Using Titanium, you develop your mobile application in JavaScript coded against the Titanium APIs. The Titanium platform then invokes its builder to take your JavaScript and produce a native application for iOS and Android.
This session will walk you through the details of building great apps for the Android and iOS platforms. We'll talk about Titanium development, its ecosystem, and architecture. We'll spend time looking at lots of code - we'll build an app, in fact, while we discuss and explore the framework. We'll also spend some time discussing best practices, what to expect when developing against it, and the limits of this type of development.
The Single Page App, or SPA, requires the developer to think about architecture in new ways compared to traditional server-side page-generation web apps. Whether you've used Java web frameworks like Spring Web or Struts, dynamic frameworks like Grails or Rails, or even Django or PHP, you will need to learn some new techniques for building SPA-style applications. In this session we discuss the architecture and design of Service Oriented Front End Architectures (SOFEA).
In this session, we'll look at structuring your app from both the client and server side. Using a SPA framework like Backbone.js or Ember can help, but does not answer critical design questions. This is a fast paced session where we'll talk about SPAs, REST, SEO, and much more.
There's a bevy of options for developing mobile apps. If you're looking at cross-platform solutions, there's a multitude of options to choose from. In this session we'll explore the three basic categories for developing mobile apps: native, cross-platform-to-native, and mobile web. We'll discuss the sweet spot for each of these three approaches and the benefits and drawbacks of each. Technologies discussed include Android, iOS, HTML5/CSS3, Phonegap, Titanium, and jQuery Mobile.
Getting software released to users is often a painful, risky, and time-consuming process. This workshop sets out the principles and technical practices that enable rapid, incremental delivery of high quality, valuable new functionality to users. Through automation of the build, deployment, and testing process, and improved collaboration between developers, testers and operations, delivery teams can get changes released in a matter of hours–sometimes even minutes–no matter what the size of a project or the complexity of its code base.
In this workshop we take the unique approach of moving from release back through testing to development practices, analyzing at each stage how to improve collaboration and increase feedback so as to make the delivery process as fast and efficient as possible. At the heart of the workshop is a pattern called the deployment pipeline, which involves the creation of a living system that models your organization's value stream for delivering software. We spend the first half of the workshop introducing this pattern, and discussing how to incrementally automate the build, test and deployment process, culminating in continuous deployment.
Robert Martin assembled the SOLID family of principles to provide a useful guide to help us create object-oriented software designs that were resilient in the face of change. In recent years, the need to write highly-concurrent software in order to leverage increasingly ubiquitous multicore architectures, as well as general interest in more effectively controlling complexity in large software designs, has driven a renewed interest in the functional programming paradigm. Given the apparent similarity in their goals, “What is the intersection of SOLID with functional programming?” is a natural question to ask.
In this talk, we'll explore this intersection. We'll begin with a tour of the evolutionary patterns associated with enterprise software and programming paradigms, as well as take a look at the ongoing quest for best practices, the goal being to elucidate the motivation for examining this intersection of SOLID and functional programming. We'll then walk through each of the SOLID principles, examining them in their original object-oriented context, and looking at example problems and solutions using the Java language. Then for each principle, we'll examine its possible intersection with the functional programming paradigm, and explore the same problems and solutions using the Clojure language. We'll close by examining the transcendent qualities of the SOLID principles and how they can make any design simpler, regardless of the programming paradigm employed.
For much of the last two years I've delivered a two-part series at NFJS shows entitled “Effective Java Reloaded.” For all practical purposes, it is an a la carte-style rehash of the book Effective Java, written by Josh Bloch. One of my favorite parts of the discussion is Item #15, which tells us to “Minimize Mutability.” If we turn this inside out, we're actually saying that we want to MAXIMIZE IMMUTABILITY. When we do this, we reap many benefits, such as code that is easier to reason about and that is inherently thread-safe. This can carry us a long way in the direction of program correctness and decreased complexity. However, when we start to program with immutability, several major questions arise.
First, the necessity of using a separate object for each distinct value, never reusing or “mutating” an object, can quickly cause performance concerns. These concerns are amplified when we're talking about large collections such as lists and maps. These problems are largely solved by what we call “persistent data structures.” Persistent data structures are collections from which we create new values not by copying the entire data structure and applying changes, but by creating a new structure which contains our changes and points at the previous structure for those elements which have not changed. This allows us to work with data structures in a very performant way with respect to both time and resource consumption. We'll examine persistent data structures, their associated algorithms, and implementations on the JVM such as those found in the TotallyLazy library.
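To illustrate the structural-sharing idea in a deliberately simplified form (real libraries use more sophisticated tree structures), here is a tiny persistent list sketch in Java:

```java
// A tiny persistent (immutable) list: "adding" an element creates a new node
// that points at the existing list, so nothing is copied and the old value
// remains valid and unchanged.
public final class PList<T> {
    private final T head;
    private final PList<T> tail;
    private final int size;

    private PList(T head, PList<T> tail, int size) {
        this.head = head;
        this.tail = tail;
        this.size = size;
    }

    public static <T> PList<T> empty() {
        return new PList<>(null, null, 0);
    }

    // O(1): the new list shares the old one as its tail.
    public PList<T> prepend(T value) {
        return new PList<>(value, this, size + 1);
    }

    public T head() { return head; }
    public PList<T> tail() { return tail; }
    public int size() { return size; }

    public static void main(String[] args) {
        PList<String> original = PList.<String>empty().prepend("b").prepend("a");
        PList<String> extended = original.prepend("z");

        System.out.println(original.size()); // 2 -- untouched by the "update"
        System.out.println(extended.size()); // 3 -- shares original's nodes
    }
}
```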
Second, because all of an immutable object's state must be provided at the time of construction, the construction of large objects can become very tedious and error prone. We'll examine how the Builder pattern can be applied to ease the construction of large objects, and we'll examine Builder implementations in Java and Groovy.
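Here is a minimal sketch of that Builder approach for an immutable type; the Conference class and its fields are invented for the example:

```java
// All fields are final and the constructor is private; the only way to get an
// instance is through the Builder, which keeps construction readable even as
// the number of properties grows.
public final class Conference {
    private final String name;
    private final String city;
    private final int attendees;

    private Conference(Builder builder) {
        this.name = builder.name;
        this.city = builder.city;
        this.attendees = builder.attendees;
    }

    public String getName() { return name; }
    public String getCity() { return city; }
    public int getAttendees() { return attendees; }

    public static class Builder {
        private String name;
        private String city;
        private int attendees;

        public Builder name(String name) { this.name = name; return this; }
        public Builder city(String city) { this.city = city; return this; }
        public Builder attendees(int attendees) { this.attendees = attendees; return this; }

        public Conference build() {
            return new Conference(this);
        }
    }

    public static void main(String[] args) {
        Conference conf = new Conference.Builder()
                .name("Rich Web Experience")
                .city("Fort Lauderdale")
                .attendees(300)
                .build();
        System.out.println(conf.getName() + " in " + conf.getCity());
    }
}
```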
Third, we run into problems when we start to use frameworks that expect us to program in a mutable style. A prime example is Hibernate, which expects our persistent classes to follow the well-worn JavaBean convention, including a no-argument constructor and getters and setters for each property. Such a class can never be immutable! So how do we program with frameworks such as Hibernate and yet still minimize mutability? The key is found in not letting frameworks dictate the way that you design your code. Just because the framework requires something, don't let it force you to make the wrong decision. Use the framework as a tool to write your code; don't let your code be a tool of the framework. We'll examine strategies for doing exactly that.
You should come away from this talk better equipped to program in a way that minimizes mutability and maximizes immutability.
Have you ever wished that your local development sandbox could look exactly like production, but you've got a mismatch between your local OS and your production OS? And what about the age old “it works on my machine” excuse that quite often stems from differences between developer sandboxes? Many have turned to virtualization, creating a machine image that can be passed around the team. But who manages the template? How do you keep things in sync?
In this session, we'll explore Vagrant (http://www.vagrantup.com), an open source tool that allows you to easily create and manage virtual development environments that can be provisioned on demand and “thrown away” when no longer needed.
Vagrant is most powerful when we think of it as a tool to enable various workflows that are useful to software development teams. In this talk, we'll walk through the following workflows and examine Vagrant's contributions:
This is the first in a new series on resource-oriented systems. The goal of the series is to provide practical guidance on the design and implementation of next-generation systems that are flexible, extensible, high-performance and future-friendly. The talks are designed to work as an arc, building upon each other, but they should also stand alone. The first topic is a guided walk-through of building quality REST APIs.
We will focus on the architecture of the Web and how it can help us model and manipulate our important business concepts. We will discuss the role of stable identifiers, intentional representation design, hypermedia affordances and architectural consistency. The goal is not to be “RESTful”, the goal is to build systems that display the properties we require.
This talk will be accessible for people new to REST, but also different enough that those who have attended previous REST talks will learn new things.
This is the second in a new series on resource-oriented systems. The goal of the series is to provide practical guidance on the design and implementation of next-generation systems that are flexible, extensible, high-performance and future-friendly. The talks are designed to work as an arc, building upon each other, but they should also stand alone. This second talk is an introduction to the use of Semantic Web technologies to enable collaboration without coordination.
REST is a means to an end, but it is not a satisfactory end state. It usually pushes complexity to the client in ways that make data integration difficult across multiple sources. The W3C Semantic Web initiative introduces us to new technologies for linking resources and querying across them in powerful new ways. We will learn about the RDF model, what it brings to the table and how we can use it connect information regardless of where and how it is stored. We will use the SPARQL protocol and query language to ask powerful questions of arbitrary resources. We will also see how we can create new information just by asking for it.
This is the third in a new series on resource-oriented systems. The goal of the series is to provide practical guidance on the design and implementation of next-generation systems that are flexible, extensible, high-performance and future-friendly. The talks are designed to work as an arc, building upon each other, but they should also stand alone. This third talk will introduce you to RDFa, one of the most exciting technologies on the Web, estimated to be used on at least 25% of indexed pages.
We understand that documents contain information, but it is usually only accessible to humans if they know where and how to find them. What if we could automatically extract arbitrary information about arbitrary domains and connect it to information held elsewhere? What if we could use the information in a document to help us organize our content better? What if this embedded information could help external search engines index the public Web better and improve your rankings?
This talk will show you how to weave and extract information in HTML, XHTML and arbitrary XML using standard tools such as RDFa. In the process, you will learn how to free the information so that it may be reused in powerful and unanticipated ways.
If you're not terrified, you're not paying attention.
Publishing information as webs of data does not require us to just give it away. We have a series of tools and techniques for managing identity, authentication, authorization and encryption so we only share content with those we trust.
Before we tackle Web Security, however, we need to figure out what we mean by Security. We will pull from the worlds of Security Engineering and Software Security to lay the foundation for technical approaches to protecting our web resources.
The surge of interest in the REpresentational State Transfer (REST) architectural style, the Semantic Web, and Linked Data has resulted in the development of innovative, flexible, and powerful systems that embrace one or more of these compatible technologies. However, most developers, architects, Information Technology managers, and platform owners have only been exposed to the basics of resource-oriented architectures.
This talk, based upon Brian Sletten's book of the same name, is an attempt to catalog and elucidate several reusable solutions that have been seen in the wild, in the now increasingly familiar “patterns” style. These are not turnkey implementations but rather useful strategies for solving certain problems in the development of modern, resource-oriented systems, both on the public Web and within an organization's firewalls.
In our industry, we have a problem. It's called the Software Problem. It is an embarrassing indictment of our capacity to deliver quality software on time and under budget. Beyond that, when we do deliver running code, it is often fragile and hard to extend. There are many reasons for this and many solutions. But one that does not get enough attention is how we approach information.
This is a theoretical discussion. You may not learn something you can use right away, but you may learn new ways of thinking about designing and building systems with an information-centric focus. We will discuss the roles of databases, services, software models, REST, the Web and the roles they all play together.
Your application has state. Every action that every user takes changes this state. Yet, so much time and effort is spent on trying to persist something that is so transient. What if you didn't need to bother with the slowness of “persistence”? What if you could free the core of your application from I/O?
I will look at a couple of patterns that question one of the basic assumptions of today's architectures - that persistence of state is central to your application.
Find yourself overwhelmed with hundreds of to-dos? Is your hard drive littered with dozens of killer ideas that you started with enthusiasm, only to watch them fizzle away? Do you feel like you are moving as fast as you can but only getting to the wrong place quicker? Well, perhaps this session will help.
There are various techniques and strategies available to us today that aim to help with exactly this conundrum - from Getting Things Done ™ to Personal Kanban. Unfortunately it is often easy to be extremely productive using these systems, but not very effective. After all, it's not about getting things done, but getting the RIGHT things done. In this talk we will discuss not only how to get things done, but also attempt to figure out what it is you actually need to be doing.
In this session, I will attempt to show you how you can leverage various strategies to be more effective, knock to-dos out and have fun while doing it. If time permits we will close with an overview of the tools that are available to you, and how you can use these to become a to-do list ninja :)
In this session we will look to see how we can refactor our learning - what tools and methodologies we can use to help us learn quicker and better, and how we can create a store that gives us quick access to information when we really need it.
We all work in an industry in which not only do the tools that we use change every few years, but one in which we have to shift the very paradigms these tools are built on! Even the most trivial of projects entails tens of different toolkits, frameworks, and languages coming together, and somehow we need to know how to leverage each one. How does one keep up? Despite all our years in school, and our inborn nature to learn, we are often never taught how to learn. How can we learn faster, and retain even more?
In this session we will take a look at various tools and techniques available to us and see how we can make our learning effective.
You may have noticed today's web applications involve more than a few lines of JavaScript. You've probably also figured out JavaScript lacks certain…features…that make writing non-trivial applications more challenging. How do we resolve this conundrum? Luckily for us, we can leverage libraries like Backbone to add some structure to our code. Backbone brings the concepts of the model-view-controller pattern we've applied on the server for years to the browser.
In this workshop, we'll introduce the idea of asynchronous user interfaces and show how Backbone helps us write that style of application. We'll work our way up from the bottom building a simple application along the way. We'll create models, we'll use a templating library (or two) and we'll also explore Underscore - a JavaScript utility belt you can use right now today without committing to building MVC style web applications.
If you're struggling to manage an increasing amount of JavaScript or you want to build more responsive web applications, this workshop can help!
For this workshop you should have:
Optionally, you can have Git installed but it isn't required. Before the workshop, take a minute to set up the code somewhere on your laptop. The code is on github:
https://github.com/ntschutta/backbone_workshop
You can clone the repo from there or simply download a zip if you prefer. That's it! See you in Florida!
The word just came down from the VP - you need a mobile app and you need it yesterday. Wait, you've never built a mobile app…it's pretty much the same thing as you've built before, just smaller, right? Wrong. The mobile experience is different and far less forgiving. How do you design an application for touch? How does that differ from a mouse? Should you build a mobile app or a mobile web site? This workshop will get you started on designing for a new, and exciting, platform. Whether that means iPhone, Android, Windows Phone or something else, you need a plan, and this talk will help.
We'll look at some popular web sites, discussing what we would do differently in a mobile context, and then take a look at the actual mobile experience to see what other designers actually did. Using paper, we'll work through a design or two of our own. We'll wrap up discussing various methods of creating a mobile app - should we use the web or build something native? What about shell apps? While we might not have all the answers, at the end of this workshop you'll know what questions to ask when thinking through your own situation.
This workshop is more analog than digital - a laptop is not required. A pen or pencil though, is!
Git, at its core, leverages a relatively simple data structure to maintain history. In this session we will take a look at this data structure, which in turn will give us a better view of how Git manages history, and how better to work with it. NOTE: This is NOT an introduction to Git. This session assumes familiarity with Git concepts such as init, add, commit and merge.
Git has fast emerged as one of the leaders in DVCS. Git may seem arcane, but under the covers it leverages a very simple data structure to store your version history. As developers, it has always served us well to know how things fundamentally work, and Git is no different. In this talk we will explore this data structure, and how the various commands you invoke against Git mutate it.
In this session, we will take a look at Angular - a new MVC framework by Google. We will discuss some of the terminology that Angular offers, and see how we can use that to develop highly interactive, dynamic web applications. See “Detail” for a list of topics I cover and the Github repo URL
This is an intro-level talk in which we will take a look at Angular and at developing rich web applications. Angular embraces HTML and CSS, allowing you to extend HTML towards your application, and uses plain JavaScript, which makes your code easy to reuse and test.
Note: This is an intro level talk. It is targeted towards developers who are curious about Angular and want to learn about the fundamental features and concepts in Angular.
Topics Covered -
ng-app
ng-init and the evaluation {{ }} directive
$rootScope
ng-model
$scope
ng-repeat
ng-form, form validation and submission in AngularJS
$http
GitHub URL - https://github.com/looselytyped/angudone-backend/tree/solutions
Groovy isn't designed to replace Java – it just makes Java cleaner and easier to develop. This presentation will look at various tasks Java developers need to do and demonstrate ways Groovy can help.
Topics will include building and testing applications, accessing both relational and NoSQL databases, working with web services, and more.
The Spock framework brings simple, elegant testing to Java and Groovy projects. It integrates cleanly with JUnit, so Spock tests can be integrated as part of an existing test suite. Spock also includes an embedded mocking framework that can be used right away.
In this presentation, we'll look at several examples of Spock tests and review most of its capabilities, including mock objects and integration with Spring.
Have you looked into Scala? Scala is a new object-functional JVM language. It is statically typed and type inferred. It is multi-paradigm and supports both object oriented and functional programming. And it happens to be my favorite programming language.
If you are interested in Scala, how are you planning to learn it? You will probably pick up a book or two and follow through some examples, and hopefully at some point down the line you will learn the language and its syntax, and if you get excited enough maybe build large applications using it. But what if I told you that there is a better path to enlightenment for learning Scala?
Enter the Scala Koans, a set of test cases that will teach you the Scala language. The Scala Koans will help the audience learn the language, its syntax, and the structure of the language through test cases. They will also teach the functional programming and object-oriented features of the language. Since learning is guided by failing tests, developers are free to think about and play with the language while they are learning.
Namaste,
For those planning to attend the Scala Koans…
Welcome to Scala Koans!
Scala Koans is an interactive session that puts the programming and learning in your hands. Therefore, a laptop is required by all participants. If you do not have a laptop, then perhaps you have a friend with a laptop; if so, well, that would work too. In order to participate in the Scala Koan endeavor, a few things are required:
The process of actually running the koans will be covered during the session. Unfortunately, Internet connectivity is sometimes a dicey affair and at times it can rain on our parade. To avoid having to wait for the install at the conference you can prepare for the koans before the conference! If you don't have the opportunity to do this, we will have either memory sticks or private networks at the conference.
If you want to get started with the set up:
Before attending the koans session, you may want to take the opportunity to load some Scala Plugins onto your favorite IDE and Editor. Below is a list of resources that you can use to enhance your environment so that you can enjoy Scala syntax highlighting and other helpful tools like refactoring, debugging and analysis.
Eclipse - Eclipse has an IDE plugin for Scala, aptly called scala-ide. All the information about the plugin can be found at http://scala-ide.org, including an easy-to-follow video located at http://scala-ide.org/docs/current-user-doc/gettingstarted/index.html. Scala-IDE is also available in the Eclipse Marketplace!
IntelliJ - IntelliJ has a Scala plugin that can be found by going to Settings -> Plugins, clicking on 'Browse Repositories' button and searching for the 'Scala' plugin on the left. Right click on the 'Scala' and choose 'Install'. IntelliJ will prompt you to restart the IDE, do so, and enjoy.
NetBeans - Currently, Github user 'dcaoyuan' hosts a NetBeans Scala plugin at the address: https://github.com/dcaoyuan/nbscala. I have not tried this out since the number of NetBeans users has shrunk in recent years. If you are an avid NetBeans user, and wish to try it, you can let me know the results during the session. There is additional information at: http://wiki.netbeans.org/Scala
Emacs - Github user 'aemoncannon' has created ENSIME (ENhanced Scala Interaction Mode for Emacs), which has a great following. It is hosted at https://github.com/aemoncannon/ensime, with documentation at http://aemoncannon.github.io/ensime.
VIM - For VIM users you can use https://github.com/derekwyatt/vim-scala as a VIM plugin that offers Scala color highlighting
That is it. Hope to see you soon.
Presentation on Akka: a set of various tools for writing concurrent, fault-tolerant applications using immutable data, asynchronous message passing with local and remote actors, software transactional memory, and supervised systems.
Akka is middleware, but it is not your 1990s middleware. Akka is a set of various tools for writing concurrent, fault-tolerant applications using immutable data, asynchronous message passing with local and remote actors, software transactional memory, and supervised systems. Akka is also part of the Typesafe stack, a stack that includes the Play web framework and the Scala programming language. This Akka presentation will cover both Scala- and Java-style usage of Akka and give the audience a 30,000-foot view of how it all comes together. While this presentation is not interactive, all demos will be available on GitHub for those that want to “play” along with their laptops.
The software industry changes rapidly, but you can protect yourself from these changes by creating code that is complicated enough that only you can maintain it.
Of course you should not engage in obvious bad practices. The good news is that you don't have to. You can follow idiomatic industry practice and stay buzzword compliant with the latest trends, while quietly spreading complexity throughout systems. Better yet, the symptoms will show up not in your own code, but in other code that uses your code, directly or indirectly. You will be a hero as you lead larger and larger teams burning the midnight oil to keep systems alive.
Practice these principles, and your code will have an infectious complexity that guarantees you will always be needed to maintain it.
The key to understanding Clojure is ideas, not language constructs. In this talk, we will approach Clojure via 10 Big Ideas.
Each of these ideas is valuable and useful a la carte, not only when used together in Clojure. Taken together, they begin to fill in the picture of why Clojure is changing the way many programmers think about software development.
Traditional automated testing approaches combine input generation, execution, output capture, and validation inside the bodies of single functions. Generative testing approaches gain expressive power by isolating these steps.
With generative testing:
There are a number of benefits to this approach:
This talk introduces test data generation and generative testing, using for its examples the data.generators and test.generative libraries developed by the author.
Simulation allows a rigorous, scalable, and reproducible approach to testing. The separation of concerns, and the use of a versioned, time-aware database, give simulation great power. This talk will introduce simulation testing, walking through a complete example using Simulant, an open-source simulation library.
Simulation allows a rigorous, scalable, and reproducible approach to testing:
Simulation begins with statistical models of the use of your system. This model includes facts such as “we have identified four customer profiles, each with different browsing and purchasing patterns” or “the analytics query for the management report must run every Wednesday afternoon.” Models are versioned and kept in a database.
The statistical models are used to create activity streams. Each agent in the system represents a human user or external process interacting with the system, and has its own timestamped stream of interactions. With a large number of agents, simulations can produce the highly concurrent activity expected in a large production system.
Agents are scaled across as many machines as are necessary to both handle the simulation load, and give access to the system under test. The simulator coordinates time, playing through the activity streams for all the agents.
Every step of the simulation process, including modeling, activity stream generation, execution, and the code itself, is captured and stored in a database for further analysis. You will typically also capture whatever logs and metrics your system produces.
Since all phases of a simulation are kept in a database, validation can be performed at any time. This differs markedly from many approaches to testing, which require in-the-moment validation against the live system.
The separation of concerns above, and the use of a versioned, time-aware database, gives simulation great power. Imagine that you get a bug report from the field, and you realize that the bug corresponds to a corner case that you failed to consider. With a simulation-based approach, you can write a new validation for the corner case, and run that validation against your past simulation results, without ever running your actual system.
This talk will introduce simulation testing, walking through a complete example using Simulant, an open-source simulation library.
Time is very precious and is often threatened by phone calls, emails, co-workers, bosses, and most of all, yourself. The Pomodoro Technique reins in unfocused time and gives your work the urgency and the attention it needs, and it's done with a kitchen timer.
In this presentation we discuss how to set up, estimate time, log time, deal with interruptions, and integrate with Agile as a team. We discuss timer software and even some of the great health benefits of the Pomodoro Technique.
Today’s interconnected world requires that organizations rapidly deliver flexible-integrated solutions. The conventional approach is to integrate heterogeneous applications using web services but unfortunately that tends to tightly couple those applications. In this session we will explore several alternatives for achieving Enterprise Integration Agility.
Public Web APIs are increasing at an exponential rate, resulting in an ever more connected web. This connected contagion is not just relegated to the domain of Web 2.0 but has infected the corporate world. In fact, companies are becoming more reliant on Software as a Service (SaaS) to provide key business functions.
Combating this contagion requires an approach that provides a type of insurance against constant change and lays the foundation for evergreen enterprise solutions. In this session we will explore three popular architectural styles including Message Oriented, Service Oriented, and Resource Oriented Architecture that are used to achieve Enterprise Integration Agility. In addition, I will provide examples of each architectural style using ActiveMQ/Camel, Mule ESB, and NetKernel.
In this session, I will demonstrate several concurrent processing techniques including Fire and Forget, Fork-Join, Producer-Consumer, and Asynchronous Web Services using the Java Concurrency Library, the Akka Framework and the Spring Framework.
Traditional concurrent development on the Java Platform requires in-depth knowledge of threads, locks, and queues. Fortunately, new languages and frameworks that run on the Java Platform have made concurrent processing easier. This session applies concurrent processing patterns and techniques using several popular libraries and frameworks.
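As one concrete example of these techniques, here is a minimal Fork-Join sketch using the fork/join framework in java.util.concurrent (illustrative only; the session's own demos may differ). The task recursively splits an array summation until the chunks are small enough to compute directly, then joins the partial results.

```java
import java.util.concurrent.ForkJoinPool;
import java.util.concurrent.RecursiveTask;

// Fork-Join: divide the work, compute the pieces in parallel, combine results.
public class SumTask extends RecursiveTask<Long> {
    private static final int THRESHOLD = 1_000;
    private final long[] numbers;
    private final int start, end;

    public SumTask(long[] numbers, int start, int end) {
        this.numbers = numbers;
        this.start = start;
        this.end = end;
    }

    @Override
    protected Long compute() {
        if (end - start <= THRESHOLD) {
            long sum = 0;
            for (int i = start; i < end; i++) {
                sum += numbers[i];
            }
            return sum;
        }
        int mid = (start + end) / 2;
        SumTask left = new SumTask(numbers, start, mid);
        SumTask right = new SumTask(numbers, mid, end);
        left.fork();                        // run the left half asynchronously
        long rightResult = right.compute(); // compute the right half in this thread
        return left.join() + rightResult;   // wait for the left half and combine
    }

    public static void main(String[] args) {
        long[] numbers = new long[100_000];
        for (int i = 0; i < numbers.length; i++) {
            numbers[i] = i + 1;
        }
        long sum = new ForkJoinPool().invoke(new SumTask(numbers, 0, numbers.length));
        System.out.println(sum); // 5000050000
    }
}
```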
In this session you will learn to strategically introduce technology innovations by applying specific change patterns to groups of individuals. Using these patterns and related techniques will not only benefit your organization but will ultimately benefit your career as a technologist by making you a better influencer, writer, and speaker.
The rapid pace of technological innovation has enabled many organizations to dramatically increase productivity while at the same time decrease their overall headcount. However, the vacillating global economy combined with “change fatigue” within organizations has resulted in a risk averse culture. In such an environment how can one possibly introduce and inculcate the latest technology or process within an organization? The answer is to have a solid understanding of Diffusion Theory and to leverage Patterns of Change.
Prezi Location: http://prezi.com/b85wwmw7hccn
In this RESTful Imaginarium you will learn about the core concepts of REST demonstrated through leading RESTful web service frameworks: Jersey (JAX-RS), Restlet, Spring MVC and NetKernel. During this daydream you will learn about the fallacies of URL parameters, the debate of PUT vs. POST and the power of HATEOAS.
RESTful web services have become the preferred approach to synchronously integrate heterogeneous systems. The REST Architectural Style’s success is due in large part to its simplicity and the fact that it is based on a small set of widely accepted standards, such as HTTP. Furthermore, REST requires far fewer development steps, toolkits and execution engines than conventional SOAP web services.
This session covers the core concepts of REST and then walks through how to design and implement RESTful web services using leading RESTful web service frameworks, Jersey (JAX-RS), Restlet, Spring MVC and NetKernel.
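For a flavor of the JAX-RS style used by Jersey (or any other JAX-RS implementation), here is a minimal, illustrative resource class; the Speaker type and its data are placeholders invented for the example.

```java
import java.util.Arrays;
import java.util.List;

import javax.ws.rs.GET;
import javax.ws.rs.Path;
import javax.ws.rs.PathParam;
import javax.ws.rs.Produces;
import javax.ws.rs.core.MediaType;

// JAX-RS maps the URI space and representation format onto a plain Java class
// through annotations.
@Path("speakers")
public class SpeakerResource {

    // GET /speakers
    @GET
    @Produces(MediaType.APPLICATION_JSON)
    public List<Speaker> list() {
        return Arrays.asList(new Speaker("1", "Ada"), new Speaker("2", "Grace"));
    }

    // GET /speakers/{id}
    @GET
    @Path("{id}")
    @Produces(MediaType.APPLICATION_JSON)
    public Speaker find(@PathParam("id") String id) {
        // A real implementation would look the speaker up in a data store.
        return new Speaker(id, "Ada");
    }

    public static class Speaker {
        public String id;
        public String name;

        public Speaker() { }

        public Speaker(String id, String name) {
            this.id = id;
            this.name = name;
        }
    }
}
```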
Getting software released to users is often a painful, risky, and time-consuming process. This workshop sets out the principles and technical practices that enable rapid, incremental delivery of high quality, valuable new functionality to users. Through automation of the build, deployment, and testing process, and improved collaboration between developers, testers and operations, delivery teams can get changes released in a matter of hours–sometimes even minutes–no matter what the size of a project or the complexity of its code base.
In the second half of the workshop, we introduce agile infrastructure, including the use of Puppet to automate the management of testing and production environments. We'll discuss automating data management, including migrations. Development practices that enable incremental development and delivery will be covered at length, including a discussion of why branching is inimical to continuous delivery, and how practices such as branch by abstraction and componentization provide superior alternatives that enable large and distributed teams to deliver incrementally.
This multi-disciplinary session takes a deep dive into the confluence of topics required to fully understand the intersection of Continuous Delivery and architecture, including evolutionary architecture and emergent design, with an emphasis on how architectural decisions affect the ease of changing and evolving your code, the role of metrics in understanding code, how Domain Driven Design's Bounded Context reifies in architecture, how to reduce intra-component/service coupling, and other techniques.
Continuous Delivery is a process for automating the production readiness of your application every time a change occurs, whether to code, infrastructure, or configuration. Rather than honing skills at predicting the future via Big Design Up Front, Continuous Delivery emphasizes techniques for understanding and changing code at lower cost throughout the process. Some architectures and engineering practices yield better designs for this environment. This multi-disciplinary session takes a deep dive into the confluence of topics required to fully understand the intersection of Continuous Delivery and architecture, including evolutionary architecture and emergent design, with an emphasis on how architectural decisions affect the ease of changing and evolving your code, the role of metrics in understanding code, how Domain Driven Design's Bounded Context reifies in architecture, how to reduce intra-component/service coupling, and other techniques.
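As one small illustration of how a Bounded Context can reify in code, consider the following hedged sketch (all names hypothetical, not material from the session): each context keeps its own model of the same real-world concept, and a small translator at the boundary keeps coupling between them low.

// Two Bounded Contexts each own their model of "customer"; only the
// translator at the boundary is aware of both.
public class BoundedContextSketch {

    // Sales context: a customer is an account with credit terms.
    static class SalesCustomer {
        final String accountId;
        final String billingName;
        final long creditLimitInCents;
        SalesCustomer(String accountId, String billingName, long creditLimitInCents) {
            this.accountId = accountId;
            this.billingName = billingName;
            this.creditLimitInCents = creditLimitInCents;
        }
    }

    // Shipping context: a recipient is just a name and a delivery address.
    static class ShippingRecipient {
        final String name;
        final String deliveryAddress;
        ShippingRecipient(String name, String deliveryAddress) {
            this.name = name;
            this.deliveryAddress = deliveryAddress;
        }
    }

    // Anti-corruption layer: the only code aware of both models, so neither
    // context's internals leak into the other.
    static ShippingRecipient toRecipient(SalesCustomer customer, String deliveryAddress) {
        return new ShippingRecipient(customer.billingName, deliveryAddress);
    }
}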
In this session, we're going to combine the magic of Spring Boot and the magic of Spring Data to yield something even more powerful. You'll see how to quickly build an application's persistence layer, whether it stores data in an RDBMS, Mongo, Neo4j, or several other popular data stores. You'll also see how to create a functioning REST API with nothing more than an interface and a domain type.
Spring Boot dramatically simplifies application development with Spring. But before Spring Boot came along, Spring Data was already making developers' lives easy when it comes to working with data. When combined, Spring Data and Spring Boot can make data persistence the easiest part of your application.
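To give a flavor of that combination, here is a minimal sketch, assuming a hypothetical Book domain rather than the presenter's actual demo: a Spring Boot application whose persistence layer is nothing but a domain type and a repository interface, with Spring Data generating the implementation at runtime.

// Requires the Spring Boot and Spring Data JPA starters plus a database
// driver (for example H2) on the classpath. All names are hypothetical.
import javax.persistence.Entity;
import javax.persistence.GeneratedValue;
import javax.persistence.Id;

import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.SpringBootApplication;
import org.springframework.data.repository.CrudRepository;

@SpringBootApplication
public class BooksApplication {
    public static void main(String[] args) {
        SpringApplication.run(BooksApplication.class, args);
    }
}

@Entity
class Book {
    @Id @GeneratedValue
    Long id;
    String title;
}

// No implementation needed: Spring Data derives CRUD operations and this
// finder from the method name.
interface BookRepository extends CrudRepository<Book, Long> {
    Iterable<Book> findByTitle(String title);
}

With Spring Data REST added to the classpath, the same repository can be exposed as a hypermedia REST API without writing any controller code, which is the "interface plus a domain type" payoff described above.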
In recent years, the cloud has gone from Larry Ellison's “Maybe I'm an idiot, but I have no idea what anyone is talking about,” to Microsoft's “TO THE CLOUD!” to a central part of many companies' IT strategy. At the same time, the way that we consume the cloud has continued to evolve. Many of today's cloud efforts revolve around utilization of various “infrastructure as code” products (e.g. Puppet and Chef) and homegrown automation to create deployment pipelines. When we start at this level, we often end up reinventing many of the same wheels as we climb the abstraction ladder.
Platform as a Service (PaaS) offerings are positioned to allow developers (and operators) to start climbing the abstraction ladder from a higher rung, shifting the model from machine-centric deployment to application-centric deployment. This session will focus on life as an application developer using Cloud Foundry as a PaaS, with demos using Pivotal's Hosted CF at http://run.pivotal.io.
We'll cover the following topics:
All parts will require:
Needless to say, you'll need a laptop! For the best experience, a Mac or Linux environment is ideal (it's technically possible to get most of this working on Windows, but it's VERY DIFFICULT). Also, you'll need at least 8 GB of RAM to support running Cloud Foundry locally without too much of a performance hit.
Part one of the workshop will require a free account on Pivotal Web Services. You can create a 60-day trial with no credit card at https://run.pivotal.io.
Parts two and four will require the use of BOSH Lite. More instructions will be provided at the conference. But, to ease setup, please install the following dependencies (in addition to those for part one!):
An alternative to setting up BOSH Lite if you have trouble with the environment (or you are stuck on Windows!) is to use TryCF. This only requires an AWS account (obviously a credit card required) and an email address, and eliminates the need for this entire dependency list. The estimated cost is 28 cents per hour, so you won't run up a huge bill!
Part Three will require the use of a FREE Cloudbees account. Create one at https://grandcentral.cloudbees.com. Once you're set up, click on “Get Started with Builds” to provision your Jenkins instance. We'll complete the remainder of the setup during the lab, but it's important to get the Jenkins instance provisioned in advance so that you don't lose 10 minutes of workshop time!
It will also require the use of a FREE TRIAL account of Artifactory Online from JFrog. You can obtain an account at https://www.jfrog.com/registration/registration.html. This one is a bit quicker, but you still should try to create it before the workshop.
Feature requests are steadily pouring in, but the team cannot respond to them. They are paralyzed. The codebase on which the company has “bet the business” is simply too hard to change. It's your job to clean up the mess and get things rolling again. Where do you begin? Your first task is to get the lay of the land by applying a family of techniques we'll call “Code Archaeology.”
In this session we will learn how to systematically explore a codebase. We'll look at what tools are available to help us out, slinging some wicked shell-fu along the way. We'll look at “code islands” and “code bridges,” and how to construct a “map of the code.” We'll also examine the wisdom that thought leaders like Michael Feathers and Dave Thomas have lent to this subject.
Once we've gained a thorough understanding of what we have in front of us, we'll learn approaches for getting the system under test and refactoring so that we can start to pick up the pace and respond to user requirements without making a bigger mess. You'll leave this session well prepared to tackle the next “big ball of mud” that gets dumped on your desk.
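One of the best-known approaches for the "getting it under test" step is the characterization test. The sketch below shows the idea in JUnit; the InvoiceCalculator class is a hypothetical stand-in for real legacy code, not an example from the session.

// A characterization test pins down what the legacy code *currently* does,
// rather than what a spec says it should do, so refactoring can proceed safely.
import static org.junit.Assert.assertEquals;
import org.junit.Test;

public class InvoiceCalculatorCharacterizationTest {

    @Test
    public void pinsDownCurrentDiscountBehavior() {
        InvoiceCalculator calculator = new InvoiceCalculator();

        // The expected value was captured by running the existing code, not
        // taken from requirements; if a refactoring changes it, we find out immediately.
        assertEquals(1350, calculator.totalWithDiscount(1500, "GOLD"));
    }
}

// Stub standing in for the legacy class under test.
class InvoiceCalculator {
    int totalWithDiscount(int amountInCents, String tier) {
        return "GOLD".equals(tier) ? amountInCents - amountInCents / 10 : amountInCents;
    }
}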
You can program higher-order functions in Groovy quite easily using closures. But the benefits of closures go far beyond that. Groovy has a variety of capabilities hidden in closures.
In this presentation, we will unlock that treasure and explore ways in which we can design applications using Groovy closures, to apply different design patterns, to create fluent interfaces, and even program asynchrony.
Invokedynamic is the Java 7 feature that had the greatest impact at the bytecode level, and one of the greatest in terms of performance. First perceived as a feature to help dynamically typed languages on the JVM, it has turned into a powerful facility that is exploited quite a bit in the implementation of features in the Java language itself.
In this presentation, we will understand what problem this feature really solves and how it has influenced other features in the Java language and on the Java platform.
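For a sense of the machinery involved, the sketch below uses the java.lang.invoke API that invokedynamic call sites are linked against. The instruction itself is emitted by compilers rather than written by hand, so a hand-written MethodHandle lookup is the closest we can get in plain source code.

// MethodHandle lookup and invocation: the runtime plumbing behind invokedynamic.
import java.lang.invoke.MethodHandle;
import java.lang.invoke.MethodHandles;
import java.lang.invoke.MethodType;

public class IndySketch {
    public static void main(String[] args) throws Throwable {
        MethodHandles.Lookup lookup = MethodHandles.lookup();

        // A handle to String.length(): the receiver becomes the first argument,
        // so the handle's type is (String)int.
        MethodHandle length = lookup.findVirtual(
                String.class, "length", MethodType.methodType(int.class));

        // invokeExact requires the call site's types to match the handle exactly.
        int result = (int) length.invokeExact("invokedynamic");
        System.out.println(result); // prints 13
    }
}

In Java 8, for example, lambda expressions compile to invokedynamic call sites that are linked through this same machinery, which is one way the feature has fed back into the Java language itself.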
This is a revised and updated version of the previous talk, with current thinking from practice and the literature. The talk presents why conflicts with your manager are inevitable based on differences in priorities and perspectives, and how to plan for them. The goal is to show you how to build the loyalty relationship that allows you to get what you need when you need it.
Topics covered will include diagnosing communication styles, lessons from game theory, working within the organizational hierarchy, and lessons on how to build a relationship with your manager that still allows you the freedom to express yourself and what you really want.
The Gradle build tool is one of the most successful projects in the Groovy ecosystem because it addresses a difficult problem – every major build is a custom build. Gradle builds are written in Groovy, so the full power of the language is available if you need it. Gradle supports Maven project structure and repositories and uses Ivy dependency management without being bound by their normal constraints. With major systems like Grails, Hibernate, and the Spring Framework moving to Gradle, this is a technology worth taking the time to understand.
This talk will cover the basics of Gradle both through simple examples and by examining the build files for major open source projects.
Monolithic applications are difficult to understand, maintain, and extend with new features and functionality. Modularity helps achieve these goals. Unfortunately, few applications have been designed with modularity in mind. In this workshop, we take a deep dive into modularity.
In part 1, we'll start developing a software system using several of the patterns of modular architecture. We'll explore the patterns and then apply them to develop a sample application. Along the way, we'll discuss implementation variations and the consequences of our decisions. When finished, we'll have a simple but useful application that you can take home with you and easily extend with new functionality. This session is all pure Java, and you'll be able to apply the techniques you learn immediately. Be sure to bring a laptop.
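To preview the flavor of the exercise, here is a hedged sketch, with hypothetical names rather than the workshop's actual codebase, of one way modularity shows up in plain Java: abstractions live in an API module, implementations live in separate modules, and consumers discover them through the standard ServiceLoader mechanism.

// api module -- src/billing/TaxCalculator.java
package billing;

public interface TaxCalculator {
    long taxInCents(long amountInCents);
}

// implementation module -- src/billing/us/UsTaxCalculator.java
// (registered in META-INF/services/billing.TaxCalculator as billing.us.UsTaxCalculator)
package billing.us;

import billing.TaxCalculator;

public class UsTaxCalculator implements TaxCalculator {
    @Override
    public long taxInCents(long amountInCents) {
        return Math.round(amountInCents * 0.07); // illustrative flat rate
    }
}

// consuming module -- src/store/Checkout.java; depends only on the api module
package store;

import java.util.ServiceLoader;
import billing.TaxCalculator;

public class Checkout {
    public static void main(String[] args) {
        // The consumer never references a concrete implementation class, so
        // implementations can be added or swapped without recompiling it.
        for (TaxCalculator calc : ServiceLoader.load(TaxCalculator.class)) {
            System.out.println(calc.taxInCents(10_000));
        }
    }
}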
Monolithic applications are difficult to understand, maintain, and extend with new features and functionality. Modularity helps achieve these goals. Unfortunately, few applications have been designed with modularity in mind. In this workshop, we take a deep dive into modularity.
In part 2, we'll finish the exercise we began in part 1. We'll continue applying several of the patterns of modular architecture. Upon completing the application, we'll have a short retrospective to discuss the consequences of our design decisions. To wrap up, we'll explore how using a framework (OSGi) that supports modularity extends the benefits of our modular architecture over to runtime without impeding our ability to leverage our modules directly atop standard Java. Be sure to bring a laptop.