A little old lady once challenged a well-known scientist's explanation of the structure of the universe, countering that the world is really a flat plate supported on the back of a giant tortoise. The scientist rebutted the little old lady's challenge with one of his own, asking what the tortoise was standing on. Her sly reply: it's “turtles all the way down.” So too is software architecture “turtles all the way down.”
In this session, we cover a broad range of topics that include challenging traditional practices of software architecture, examining what it takes to bring down the ivory tower, probing the paradoxical aspects of architecture's goal, and investigating the inextricable link between temporal decisions and structural flexibility. From the highest-level applications and services to the code that exists in the bowels of the system, and everything in between, we explore how an effective software architecture must be turtles all the way down. In the end, we will all have gained deep insight into the value of agile architecture.
Modularity is coming to the Java platform! The Jigsaw module system is slated for an upcoming Java release, and OSGi is here today. Don't wait to start designing modular software. Contrary to popular belief, you don't need a framework or a new runtime to start building modular software applications. You can start today. Learn how!
In this session, we'll examine what it means to develop modular software on the Java platform. We'll explore the goals and benefits of modular software, along with the patterns of modular architecture that help us develop modular software systems. With just a few easy steps, we'll see how to transform our software from a huge monolith to an extensible system of collaborating software modules. By examining an existing software system, we'll see firsthand how we can increase software modularity with minimal disruption. You'll walk away not just with a much deeper understanding of the benefits of modular software, but also with a migration roadmap for refactoring existing applications to increase their modularity. In other words, you'll see how to get ready today for the application platform of tomorrow.
OSGi was once heralded as a contender for most important technology of the decade. Today, most developers have heard of OSGi, but few are using it to develop their enterprise software applications. Is OSGi failing? Who is using it? And what exactly are its benefits?
In this session, we'll explain the benefits of OSGi, and show that it's not just for the middleware vendor. We'll learn how you can use OSGi without making significant changes to how you write your software applications. We'll explore the OSGi ecosystem, including platforms that support OSGi. Through code examination, we'll see how the Spring framework allows us to leverage OSGi in a non-invasive way. We'll discover how OSGi encourages Polyglot programming on the Java platform. And we'll take a brief glimpse into the future of modularity on the Java platform. You'll walk away with a much better understanding of OSGi, its strengths and benefits, how to use it effectively, as well as the myths surrounding its use.
The modularity patterns provide us with proven design techniques to develop a modular software architecture that is extensible, reusable, maintainable, and adaptable. In this session, we’ll explore 9 of the 18 modularity patterns.
This session introduces and examines the following patterns:
The modularity patterns provide us with proven design techniques to develop a modular software architecture that is extensible, reusable, maintainable, and adaptable. In this session, we’ll explore the remaining 9 modularity patterns.
This session introduces and examines the following patterns:
Getting software released to users is often a painful, risky, and time-consuming process. This workshop sets out the principles and technical practices that enable rapid, incremental delivery of high quality, valuable new functionality to users. Through automation of the build, deployment, and testing process, and improved collaboration between developers, testers and operations, delivery teams can get changes released in a matter of hours–sometimes even minutes–no matter what the size of a project or the complexity of its code base.
In this workshop we take the unique approach of moving from release back through testing to development practices, analyzing at each stage how to improve collaboration and increase feedback so as to make the delivery process as fast and efficient as possible. At the heart of the workshop is a pattern called the deployment pipeline, which involves the creation of a living system that models your organization's value stream for delivering software. We spend the first half of the workshop introducing this pattern, and discussing how to incrementally automate the build, test and deployment process, culminating in continuous deployment.
In the second half of the workshop, we introduce agile infrastructure, including the use of Puppet to automate the management of testing and production environments. We'll discuss automating data management, including migrations. Development practices that enable incremental development and delivery will be covered at length, including a discussion of why branching is inimical to continuous delivery, and how practices such as branch by abstraction and componentization provide superior alternatives that enable large and distributed teams to deliver incrementally.
A Technology Radar is a tool that forces you to organize and think about near-term technology decisions, both for you and your company. This talk discusses using the radar for personal breadth development, architectural guidance, and governance.
The ThoughtWorks Technical Advisory Board creates a “technology radar” twice a year: a working document that helps the company make decisions about interesting technologies and where we spend our time. ThoughtWorks then started conducting radar-building exercises for our clients, which provide a great medium for technologists company-wide to express their opinions about the technologies they use every day. For companies, creating a radar helps you document your technology decisions in a standard format, evaluate technology decisions in an actionable way, and create cross-silo discussions about suitable technology choices. This session describes the radar visualization and how to conduct a radar-building session for yourself. After a brief introduction, the bulk of the workshop consists of attendees building a radar for the group, following the same procedure you'll use when you do this exercise at your company. At the end, we'll have created a unique radar for this event, and you'll have practiced the process for running it at your own company.
Although Agile has proven to provide incredible benefits in software development and delivery, it is not foolproof, nor a “Silver Bullet.” Plenty of factors need to be considered before attempting this highly disciplined approach.
Learn from the mistakes other organizations have made and discover which pitfalls to avoid to ensure that your first attempt at applying an Agile approach will be met with a successful outcome. This hour-long web seminar will explore these areas and provide clear steps your team and organization can take to maximize your chances of success.
Some come to Agile assuming it involves less discipline than their traditional methods, but this is a misperception. Today, the need for discipline in software development is greater than it ever was. Agile answers that need, arriving at discipline through the Team. Agile Teams must collaborate to develop strong discipline in both planning and execution.
We'll discuss how teams can obtain Agile discipline to achieve one of our core principles: delivering “working software” frequently. We'll explore some of the key Agile planning and engineering practices, like continuous planning, Test-Driven Development, Continuous Integration, and acceptance testing. We'll look at the discipline involved in these practices, their interrelationships, and the benefits they realize in delivering value to the customer.
Technical Debt can creep up on a project very quickly and ultimately create a technical crisis. Sonar can help you see how far gone your project may be and if you are continuing to head toward a crisis.
In this workshop we will discuss how the simple act of making technical debt visible to the developer, team, manager, and organization can have a positive effect on the reversal of technical debt.
We will look at how to measure & visualize the seven axes of code quality:
Architecture & Design
Comments
Complexity
Coding Rules
Duplications
Potential Bugs
Unit Tests
We will show how to visualize measures across these seven axes of code quality. These measures can help identify where to focus our limited time and attention to make the biggest impact on technical debt.
Used appropriately, these measures can drive powerful conversations about reversing the current negative trends in your technical debt.
Where do defects come from? Technical debt is often one of our biggest challenges, as poor design and defects build up over time from cutting corners here and there. We will discuss some key technical metrics that can shine light on these defects before they get out of control, and find those that are already out of control and worth your attention now.
We'll explore how the psychological effect of simply measuring technical metrics and making them visible can have an immense impact on reducing occurrences going forward.
By now, we are all familiar with the new orthodoxy: the product owner discerns the needs of the customer and feeds them to developers in the form of a prioritized backlog. Developers pull work from that backlog, always confident that they're working on the highest-priority feature at the moment, and never having to worry about how those priorities are allocated. This system is simple, efficient, and has helped many teams function better than they used to.
Shakespeare wrote, “The first thing we do, let's kill all the lawyers.” It might be time to apply this aphorism to product management.
A few revolutionary companies are experimenting with the idea that developers should be in charge not only of when they build new features, but also of what features to build. Rather than mere code technicians following the will of a product and marketplace expert, developers themselves become experts in their product domain, building the tools users need by conceiving of those tools themselves. Dispensing with the product owner creates an entirely new organizational tenor: one in which everyone is encouraged to master the business's domain, to organize their work in autonomous ways, and to take ownership of the purpose for which the organization exists.
Come ready to hear ground-breaking ideas and engage in group discussion about how these ideas might be put into practice in your workplace.
After almost a decade and several significant releases, Spring has come a long way: from challenging the then-current Java standards to becoming the de facto enterprise standard itself. Although the Spring programming model continues to evolve, it still maintains backward compatibility with many of its earlier features and paradigms. Consequently, there's often more than one way to do anything in Spring. How do you know which way is the right way?
In this session, we'll explore several ways that Spring has changed over the years and look at the best approaches when working with the latest versions of Spring.
In this presentation, we'll see how to use Spring to create, secure, streamline, hyperlink, and consume REST APIs.
In modern applications, a diverse array of clients consumes content from the web. Each of these clients has unique capabilities and limitations, demanding that the presentation of the application be tailored to each device. As a result, presentation logic is often pushed into the client itself, leaving the application to serve a common, lightweight, data-oriented API to be consumed by each client.
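To give a flavor of the kind of code involved, here is a minimal sketch of a data-oriented endpoint (not taken from the session materials; the controller, domain type, and paths are all hypothetical, and a recent Spring MVC with Jackson on the classpath is assumed):

    import org.springframework.web.bind.annotation.PathVariable;
    import org.springframework.web.bind.annotation.RequestMapping;
    import org.springframework.web.bind.annotation.RequestMethod;
    import org.springframework.web.bind.annotation.RestController;

    // A minimal REST endpoint: GET /books/{id} returns a Book, which Spring's
    // message converters serialize to JSON for whichever client asks for it.
    @RestController
    @RequestMapping("/books")
    public class BookController {

        @RequestMapping(value = "/{id}", method = RequestMethod.GET)
        public Book book(@PathVariable long id) {
            return new Book(id, "Spring in Action");
        }
    }

    // Simple data-oriented domain type served to all clients.
    class Book {
        private final long id;
        private final String title;

        Book(long id, String title) { this.id = id; this.title = title; }
        public long getId() { return id; }
        public String getTitle() { return title; }
    }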
In this session, we're going to combine the magic of Spring Boot and the magic of Spring Data to yield something even more powerful. You'll see how to quickly build an application's persistence layer, whether it stores data in an RDBMS, Mongo, Neo4j, or several other popular data stores. You'll also see how to create a functioning REST API with nothing more than an interface and a domain type.
Spring Boot dramatically simplifies application development with Spring. But before Spring Boot came along, Spring Data was already making developers' lives easy when it comes to working with data. When combined, Spring Data and Spring Boot can make data persistence the easiest part of your application.
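As a hedged sketch of the “interface and a domain type” idea (the entity and repository names here are made up, and Spring Data JPA is assumed as the backing store; the session itself may use a different one):

    import java.util.List;
    import javax.persistence.Entity;
    import javax.persistence.GeneratedValue;
    import javax.persistence.Id;
    import org.springframework.data.repository.CrudRepository;

    // The domain type: an ordinary JPA entity.
    @Entity
    class Book {
        @Id @GeneratedValue
        Long id;
        String title;
        String author;
    }

    // The interface: Spring Data derives the queries from the method names,
    // so no implementation class is ever written. With Spring Data REST on the
    // classpath, a repository like this can even be exposed over HTTP directly.
    interface BookRepository extends CrudRepository<Book, Long> {
        List<Book> findByAuthor(String author);
    }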
In this session, we'll survey a handful of popular JavaScript libraries, frameworks, and tools that make developing rich applications in the browser a snap. Among those considered will be Spine, Backbone, Sammy, and Knockout; we'll see how they stack up against one another and (in some cases) how they can even be used together.
The pendulum has swung. For years we have developed round-trip web applications where the bulk of processing and content generation is performed server-side and simply displayed in a web browser. However, modern applications target a variety of clients, and it's becoming increasingly important for the browser to handle more than content rendering and simple form validation. These applications leverage the capabilities of modern browsers to provide a richer and more responsive experience.
In modern applications, JavaScript is increasingly prevalent both on the client side and, to some degree, on the server side. As we continue to crank out more JavaScript code, we're finding that many of the hard lessons we learned in writing decoupled Java code apply equally to JavaScript code. Without the benefit of dependency injection and AOP, both Java and JavaScript code can quickly become an unnavigable and untestable mess.
Where frameworks like Spring have helped us gain control over our Java code, Cujo.js similarly aims to give our JavaScript code more structure and testability.
In this session, we'll look at Cujo.js, an “unframework” that provides dependency injection designed with JavaScript's unique needs in mind to create loosely coupled code. We'll also see how, although Cujo.js isn't strictly a UI framework, elements of Cujo.js can be brought together to elegantly build client-side UIs.
Git is a version control system you may have been hearing a bit about lately. But simply hearing more about it may not be enough to convince you of its value. Getting hands-on experience is what really counts. In this workshop, you'll bring your Windows, Mac, or Linux laptop and walk through downloading, installing, and using Git in a collaborative fashion.
The workshop style of this class will allow you to observe and discover the value of this new version control tool firsthand. You'll be cloning, creating, committing, and pushing repositories by the conclusion of this session.
PreReq:
Basic knowledge of a version control system. Subversion knowledge is a plus, but not imperative.
So you've gotten a handle on Git and know how to use it for everyday development tasks like committing code and pushing and pulling changes with the rest of the team. But do you really know how it works under the covers? In this brief demonstration, we'll commit a file to a brand new repository without ever touching the git add or git commit commands, and in the process learn some critical Git internals that every power user should know.
We'll also take a look at some advanced history and undo commands like reflog and reset, and how to rewrite past mistakes with interactive rebase. Bring your questions and Git challenges for 90 minutes of advanced Git fun!
Gradle is a compelling new build tool that incorporates the lessons learned from a decade of Ant and Maven. More than just a compromise between declarative and imperative build formats, or between convention and configuration, Gradle is a sophisticated software development platform that makes simple builds easy and complex, highly automated continuous software delivery pipelines possible. Using its extensible APIs and expressive DSL, you're equipped to build your next build.
Bring your laptop to this session for the following:
What do you need to know about combinatorics, number theory, and the underpinnings of public key cryptography? Well, maybe more than you think!
In this talk, we'll explore the branch of mathematics that deals with separate, countable things. Most of the math we learn in school deals with real-valued quantities like mass, length, and time. However, much of the work of the software developer deals with counting, combinations, numbers, graphs, and logical statements: the purview of discrete mathematics. Join us for this brief exploration of an often-overlooked but eminently practical area of mathematics.
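A tiny worked example of the sort of counting this covers (my own illustration, not drawn from the talk): the number of ways to choose k items from n is C(n, k) = n! / (k! (n - k)!). A standard deck therefore yields C(52, 5) = 2,598,960 possible five-card hands, and a 128-bit key space contains 2^128 (roughly 3.4 x 10^38) values, the kind of quantity that underpins the security arguments of public key cryptography.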
Robert Martin assembled the SOLID family of principles to provide a useful guide to help us create object-oriented software designs that were resilient in the face of change. In recent years, the need to write highly-concurrent software in order to leverage increasingly ubiquitous multicore architectures, as well as general interest in more effectively controlling complexity in large software designs, has driven a renewed interest in the functional programming paradigm. Given the apparent similarity in their goals, “What is the intersection of SOLID with functional programming?” is a natural question to ask.
In this talk, we'll explore this intersection. We'll begin with a tour of the evolutionary patterns associated with enterprise software and programming paradigms, as well as take a look at the ongoing quest for best practices, the goal being to elucidate the motivation for examining this intersection of SOLID and functional programming. We'll then walk through each of the SOLID principles, examining them in their original object-oriented context, and looking at example problems and solutions using the Java language. Then for each principle, we'll examine its possible intersection with the functional programming paradigm, and explore the same problems and solutions using the Clojure language. We'll close by examining the transcendent qualities of the SOLID principles and how they can make any design simpler, regardless of the programming paradigm employed.
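To make the object-oriented side concrete before the session, here is a hedged Java illustration of one of those principles, the Open/Closed Principle (the example and its names are mine, not taken from the talk):

    // PriceCalculator is closed for modification but open for extension:
    // new discount policies are added by supplying new implementations of
    // the DiscountPolicy abstraction, not by editing the calculator itself.
    interface DiscountPolicy {
        double apply(double amount);
    }

    class PriceCalculator {
        private final DiscountPolicy policy;

        PriceCalculator(DiscountPolicy policy) { this.policy = policy; }

        double total(double amount) { return policy.apply(amount); }
    }

    class OcpDemo {
        public static void main(String[] args) {
            // Passing behavior as a value (here, a Java 8 lambda) hints at the
            // functional reading of the same principle.
            PriceCalculator holidaySale = new PriceCalculator(amount -> amount * 0.9);
            System.out.println(holidaySale.total(100.0)); // prints 90.0
        }
    }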
For much of the last two years I've delivered a two-part series at NFJS shows entitled “Effective Java Reloaded.” For all practical purposes, it is an a la carte-style rehash of the book Effective Java, written by Josh Bloch. One of my favorite parts of the discussion is Item #15, which tells us to “Minimize Mutability.” If we turn this inside out, we're actually saying that we want to MAXIMIZE IMMUTABILITY. When we do this, we reap many benefits, such as code that is easier to reason about and that is inherently thread-safe. This can carry us a long way in the direction of program correctness and decreased complexity. However, when we start to program with immutability, several major questions arise.
First, the necessity of using a separate object for each distinct value, never reusing or “mutating” an object, can quickly cause performance concerns. These concerns are amplified when we're talking about large collections such as lists and maps. These problems are largely solved by what we call “persistent data structures.” Persistent data structures are collections from which we create new values, not by copying the entire data structure and applying changes, but by creating a new structure that contains our changes and points at the previous structure for those elements that have not changed. This allows us to work with data structures in a very performant way with respect to time and resource consumption. We'll examine persistent data structures, their associated algorithms, and implementations on the JVM such as those found in the TotallyLazy library.
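As a hedged illustration of the structural sharing described above (a deliberately minimal sketch, not the TotallyLazy implementation), consider a persistent singly linked list in Java:

    // "Adding" an element never mutates the existing list: the new list points
    // at the old one, so both versions stay valid and all nodes are shared.
    final class PersistentList<T> {
        final T head;                 // value at the front, null for the empty list
        final PersistentList<T> tail; // the rest of the list, shared between versions

        private PersistentList(T head, PersistentList<T> tail) {
            this.head = head;
            this.tail = tail;
        }

        static <T> PersistentList<T> empty() {
            return new PersistentList<>(null, null);
        }

        // O(1): builds exactly one new node and shares the entire existing structure.
        PersistentList<T> prepend(T value) {
            return new PersistentList<>(value, this);
        }
    }

    // Usage: 'two' shares all of 'one', and 'one' itself is left untouched.
    //   PersistentList<String> one = PersistentList.<String>empty().prepend("a");
    //   PersistentList<String> two = one.prepend("b");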
Second, because all of an immutable object's state must be provided at the time of construction, the construction of large objects can become very tedious and error prone. We'll examine how the Builder pattern can be applied to ease the construction of large objects, and we'll examine Builder implementations in Java and Groovy.
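For reference, a minimal sketch of the Builder approach in Java (the class and field names are illustrative, not from the session):

    public final class Account {
        private final String owner;   // required
        private final String email;   // optional
        private final int limit;      // optional

        private Account(Builder builder) {
            this.owner = builder.owner;
            this.email = builder.email;
            this.limit = builder.limit;
        }

        public static final class Builder {
            private final String owner;
            private String email = "";
            private int limit = 0;

            public Builder(String owner) { this.owner = owner; }
            public Builder email(String email) { this.email = email; return this; }
            public Builder limit(int limit)    { this.limit = limit; return this; }
            public Account build()             { return new Account(this); }
        }
    }

    // Usage:
    //   Account a = new Account.Builder("Alice").email("a@example.com").limit(500).build();

All of the resulting object's state is supplied before build() is called, so Account itself can remain immutable.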
Third, we run into problems when we start to use frameworks that expect us to program in a mutable style. A prime example is Hibernate, which expects our persistent classes to follow the well-worn JavaBean convention, including a no-argument constructor and getters and setters for each property. Such a class can never be immutable! So how do we program with frameworks such as Hibernate and yet still minimize mutability? The key is found in not letting frameworks dictate the way that you design your code. Just because the framework requires something, don't let it force you to make the wrong decision. Use the framework as a tool to write your code; don't let your code be a tool of the framework. We'll examine strategies for doing exactly that.
You should come away from this talk better equipped to program in a way that minimizes mutability and maximizes immutability.
Even with the recent explosion in alternative languages for the JVM, the vast majority of us are still writing code in “Java the language” in order to put bread on the table. Proper craftsmanship demands that we write the best Java code that we can possibly write. Fortunately we have a guide in Joshua Bloch's Effective Java.
In his foreword to the first edition, Guy Steele writes about the importance of learning three aspects of any language: grammar, vocabulary, and idioms. Unfortunately many programmers stop learning after mastering the first two. Effective Java is your guide to understanding idiomatic Java programming.
Effective Java is organized into 78 standalone “items,” far too many to cover in one session. Instead I've chosen a subset of the most important techniques and practices that are commonly missed by today's Java programmers. You'll pick from a menu and decide where we'll head. Regardless of the path we take, you'll leave this session thoroughly equipped to write better Java code tomorrow!
There is a good amount of excitement about the new version of Java. The big evolution, of course, is lambda expressions. In this presentation we will dive into the language features in Java 8, take a look at some of their nuances, and look at ways to put them to good use.
Java 8 language capabilities and application.
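As a small taste of what those features look like (an illustrative snippet of my own with made-up data, assuming a Java 8 runtime):

    import java.util.Arrays;
    import java.util.List;

    public class Lambdas {
        public static void main(String[] args) {
            List<String> names = Arrays.asList("Ada", "Grace", "Alan", "Barbara");

            // A lambda expression in place of an anonymous Comparator implementation.
            names.sort((a, b) -> Integer.compare(a.length(), b.length()));

            // The Stream API composes filtering and mapping without explicit loops.
            names.stream()
                 .filter(name -> name.startsWith("A"))
                 .map(String::toUpperCase)
                 .forEach(System.out::println);   // prints ADA, then ALAN
        }
    }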
JavaScript has a mixed heritage: OO and functional. To date, we developers have focused on the OO side of JavaScript, and not much mind-share has been given to the other, more powerful side. In this session we'll explore how to use the power of functional JavaScript.
JavaScript has elements of two distinct programming languages: Self and Scheme. These two languages are very different, and some of JavaScript's weirdness is due to this mixing of very different language designs. The conceptual models are also very different: Self is a prototype-based object language, while Scheme is a functional language. In this session, we'll discuss how elements of the Scheme functional programming language manifest in JavaScript. We're going to explore how you can write JavaScript in a more elegant and powerful way by applying functional concepts.
So you think you've picked up enough JavaScript to be dangerous, but feel like the whole prototypical language thing is still a mystery. In this session, we'll go from basic JavaScript to advanced JavaScript. We'll discuss and code modular JavaScript with CommonJS. We'll look into the details of a prototype language and discuss things like parasitic inheritance. We'll also look at JavaScript libraries that will help you get the most out of JavaScript - not jQuery, but a library like UnderscoreJS and SugarJS.
This is a fast-paced session meant to bring you up to speed with the latest and greatest JavaScript techniques and tools. Whether you're building client-side JavaScript with HTML5 or Appcelerator Titanium, or server-side JavaScript with node.js, you'll come away with knowledge and patterns for how the pros use JavaScript for building real apps.
We've come a long way down the JavaScript road. Gone are the days of 'just hack it' for the web - architecting even a small project in JavaScript can be a challenge. Thankfully, there are several frameworks to help you; the most popular currently is Backbone.js.
Before you start using a framework in JavaScript, you will want to understand the techniques expert JavaScript programmers use to build them. In this session, we'll dive into design patterns in JavaScript, and do live coding so you can see these patterns applied. Even if you're not using a framework, you can use these design patterns to make your code more maintainable, elegant, and concise.
The Single Page App, or SPA, requires the developer to think about architecture in new ways compared to traditional server-side page generation web apps. Whether you've used Java web frameworks like Spring Web or Struts, dynamic frameworks like Grails or Rails, or even Django or PHP, you will need to learn some new techniques for building SPA-style applications. In this session we discuss the architecture and design of Service Oriented Front End Architectures (SOFEA).
In this session, we'll look at structuring your app from both the client and server side. Using a SPA framework like Backbone.js or Ember can help, but it does not answer critical design questions. This is a fast-paced session where we'll talk about SPAs, REST, SEO, and much more.
Ever wish you could use your JavaScript-foo to build a NATIVE mobile app? Wish there was an open-source platform that would let you build awesome cross-platform mobile apps? Come to this session and learn about Titanium, an open-source, JavaScript based platform for creating native mobile apps.
Titanium is an open-source development tool from Appcelerator for producing cross-platform mobile applications. Using Titanium, you develop your mobile application in JavaScript coded against the Titanium APIs. The Titanium platform then invokes its builder to take your JavaScript and build a native application for iOS and Android.
This session will walk you through the details of building great apps for the Android and iOS platforms. We'll talk about Titanium development, its ecosystem, and architecture. We'll spend time looking at lots of code - we'll build an app, in fact, while we discuss and explore the framework. We'll also spend some time discussing best practices, what to expect when developing against it, and the limits of this type of development.
Languages offer a lot more than syntax and compilers. They often have supporting libraries and special facilities that set them apart from other languages. Some languages offer special compiler support for a particular construct, like tail call optimization, for example. Others provide interesting library support or capabilities.
In this presentation we will dive into 12 cool things we can do with different languages on the JVM, things that are either impossible or hard to do in Java but are quite easy and useful to realize in other popular languages on the JVM. If mixing these languages is an option on your projects, you'll have a dozen more reasons to do so after this presentation.
Have you looked into Scala? Scala is a new object-functional JVM language. It is statically typed and type-inferred. It is multi-paradigm, supporting both object-oriented and functional programming. And it happens to be my favorite programming language.
If you are interested in Scala, how are you planning to learn it? You will probably pick up a book or two and follow through some examples. And hopefully, at some point down the line, you will learn the language and its syntax, and if you get excited enough, maybe build large applications using it. But what if I told you that there is a better path to enlightenment for learning Scala?
Enter the Scala Koans: a set of test cases that will teach you the Scala language. The koans will help you learn the language's syntax and structure through test cases, along with its functional programming and object-oriented features. Since learning is guided by failing tests, this approach allows developers to think and play with the language while they are learning.
Namaste,
For those planning to attend the Scala Koans…
Welcome to Scala Koans!
Scala Koans is an interactive session that puts the programming and learning in your hands. Therefore, a laptop is required by all participants. If you do not have a laptop, then perhaps you have a friend with a laptop; if so, that would work too. In order to participate in the Scala Koan endeavor, a few things are required:
The process of actually running the koans will be covered during the session. Unfortunately, Internet connectivity is sometimes a dicey affair, and at times it can rain on our parade. To avoid having to wait for the install on-site, you can prepare for the koans before the conference! If you don't have the opportunity to do this, we will have either memory sticks or private networks at the conference.
If you want to get started with the set up:
Before attending the koans session, you may want to take the opportunity to load some Scala Plugins onto your favorite IDE and Editor. Below is a list of resources that you can use to enhance your environment so that you can enjoy Scala syntax highlighting and other helpful tools like refactoring, debugging and analysis.
Eclipse - Eclipse has an IDE plugin for Scala, aptly called scala-ide. All the information about the plugin can be found at http://scala-ide.org, including an easy-to-follow video located at http://scala-ide.org/docs/current-user-doc/gettingstarted/index.html
IntelliJ - IntelliJ has a Scala plugin that can be found by going to Settings -> Plugins, clicking the 'Browse Repositories' button, and searching for the 'Scala' plugin on the left. Right-click on 'Scala' and choose 'Install'. IntelliJ will prompt you to restart the IDE; do so, and enjoy.
NetBeans - Currently, Github user 'dcaoyuan' hosts a NetBeans Scala plugin at the address: https://github.com/dcaoyuan/nbscala. I have not tried this out since the number of NetBeans users has shrunk in recent years. If you are an avid NetBeans user, and wish to try it, you can let me know the results during the session. There is additional information at: http://wiki.netbeans.org/Scala
Emacs - Github user 'aemoncannon' has created 'ENSIME' (ENhanced Scala Interaction Mode for Emacs), which has a great following, at https://github.com/aemoncannon/ensime, with some documentation at http://aemoncannon.github.io/ensime.
VIM - VIM users can install https://github.com/derekwyatt/vim-scala, a VIM plugin that offers Scala color highlighting.
That is it. Hope to see you soon.
Relational databases have ruled the world since the dawn of time (or so it appears). They power our enterprises, and for many in the corporate world, it may be hard to imagine life without them. Each decade a novel idea would challenge the status quo and make a case to deviate from tradition. A flock of enthusiastic programmers, like your humble speaker back in the early 90s, would throw their support behind it, only to be crushed eventually by the large vendors and enterprise standards. But the excitement around NoSQL has shown that enterprise data is not the only thing that's persistent.
In this presentation we'll learn how NoSQL deviates from those deep rooted traditional approaches, and how this may be useful. We will also discuss the situations where these types of databases may be more appropriate.
Scala is a statically typed, fully OO, hybrid functional language that provides highly expressive syntax on the JVM. It is great for pattern matching, concurrency, and simply writing concise code for everyday tasks. If you're a Java programmer intrigued by this language and interested in exploring further, this session is for you.
We will go through a rapid overview of the language, look at its key strengths and capabilities, and see how you can use this language for your day-to-day programming. This session will be coding intensive, so be ready for some serious Scala syntax and idioms.
The basics of developing for the Android platform will be explored, from setting up the SDK to using the Android Studio IDE and the generated Gradle build files. No previous experience is required, other than a basic knowledge of Java.
After discussing how Android fits into the marketplace, we'll look at creating applications, how to use activities, and working with layouts.
Building on the previous talk, we'll add intents, customized layouts for alternative configurations, talk about the activity lifecycle, use logging, and more.
We'll deploy to both emulators and connected devices, and change input styles.
Testing is well-established in the server-side Java ecosystem but is often an afterthought when it comes to Android development. This talk will review the available options and libraries used to do both unit and integration testing.
We'll look at tools that come with the SDK, as well as third-party tools for UI testing.
The Spock framework brings simple, elegant testing to Java and Groovy projects. It integrates cleanly with JUnit, so Spock tests can be integrated as part of an existing test suite. Spock also includes an embedded mocking framework that can be used right away.
In this presentation, we'll look at several examples of Spock tests and review most of its capabilities, including mock objects and integration with Spring.
This is the first in a new series on resource-oriented systems. The goal of the series is to provide practical guidance on the design and implementation of next-generation systems that are flexible, extensible, high-performance, and future-friendly. The talks are designed to work as an arc, building upon each other, but they should also stand alone. The first topic is a guided walkthrough of building quality REST APIs.
We will focus on the architecture of the Web and how it can help us model and manipulate our important business concepts. We will discuss the role of stable identifiers, intentional representation design, hypermedia affordances, and architectural consistency. The goal is not to be “RESTful”; the goal is to build systems that display the properties we require.
This talk will be accessible for people new to REST, but also different enough that those who have attended previous REST talks will learn new things.
This is the second in a new series on resource-oriented systems. The goal of the series is to provide practical guidance on the design and implementation of next-generation systems that are flexible, extensible, high-performance, and future-friendly. The talks are designed to work as an arc, building upon each other, but they should also stand alone. This second talk is an introduction to the use of Semantic Web technologies to enable collaboration without coordination.
REST is a means to an end, but it is not a satisfactory end state. It usually pushes complexity to the client in ways that make data integration difficult across multiple sources. The W3C Semantic Web initiative introduces us to new technologies for linking resources and querying across them in powerful new ways. We will learn about the RDF model, what it brings to the table, and how we can use it to connect information regardless of where and how it is stored. We will use the SPARQL protocol and query language to ask powerful questions of arbitrary resources. We will also see how we can create new information just by asking for it.
This is the third in a new series on resource-oriented systems. The goal of the series is to provide practical guidance on the design and implementation of next-generation systems that are flexible, extensible, high-performance, and future-friendly. The talks are designed to work as an arc, building upon each other, but they should also stand alone. This third talk will introduce you to RDFa, one of the most exciting of these technologies, estimated to be in use on at least 25% of the indexed Web.
We understand that documents contain information, but it is usually only accessible to humans if they know where and how to find them. What if we could automatically extract arbitrary information about arbitrary domains and connect it to information held elsewhere? What if we could use the information in a document to help us organize our content better? What if this embedded information could help external search engines index the public Web better and improve your rankings?
This talk will show you how to weave and extract information in HTML, XHTML and arbitrary XML using standard tools such as RDFa. In the process, you will learn how to free the information so that it may be reused in powerful and unanticipated ways.
If you're not terrified, you're not paying attention.
Publishing information as webs of data does not require us to just give it away. We have a series of tools and techniques for managing identity, authentication, authorization and encryption so we only share content with those we trust.
Before we tackle Web Security, however, we need to figure out what we mean by Security. We will pull from the worlds of Security Engineering and Software Security to lay the foundation for technical approaches to protecting our web resources.
In our industry, we have a problem. It's called the Software Problem. It is an embarrassing indictment of our capacity to deliver quality software on time and under budget. Beyond that, when we do deliver running code, it is often fragile and hard to extend. There are many reasons for this and many solutions. But one that does not get enough attention is how we approach information.
This is a theoretical discussion. You may not learn something you can use right away, but you may learn new ways of thinking about designing and building systems with an information-centric focus. We will discuss databases, services, software models, REST, and the Web, and the roles they all play together.
No matter where you slice software engineering, the root cause of many, if not most, problems is the common absence of critical thinking in how we approach decision making. Instead of thinking critically about our engineering decisions, we often follow a Cargo Cult mentality or blindly follow the pronouncements of the Blowhard Jamboree. The end results all too often include suboptimal productivity, excessive spending, poor quality, and cancelled projects.
When we instead think critically about a component of software engineering, we take it apart. We discard our presuppositions. We challenge tradition. We gather our own evidence. We question everything.
This talk will examine the pathologies associated with not thinking critically, including a tour of the antipatterns that can emerge from such a practice. We'll then walk through the concentric circles of the critical thinking process, including evidence evaluation, argument evaluation, and argument construction. You'll leave this session with a critical thinking framework which can be applied to software engineering as well as beyond.
Have you ever wished that your local development sandbox could look exactly like production, but you've got a mismatch between your local OS and your production OS? And what about the age-old “it works on my machine” excuse that quite often stems from differences between developer sandboxes? Many have turned to virtualization, creating a machine image that can be passed around the team. But who manages the template? How do you keep things in sync?
In this session, we'll explore Vagrant (http://www.vagrantup.com), an open source tool that allows you to easily create and manage virtual development environments that can be provisioned on demand and “thrown away” when no longer needed.
Vagrant is most powerful when we think of it as a tool to enable various workflows that are useful to software development teams. In this talk, we'll walk through the following workflows and examine Vagrant's contributions:
Invokedynamic is the Java 7 feature that had the most impact at the bytecode level and also in terms of performance. First perceived as a feature to help dynamically typed languages on the JVM, it turned into a powerful feature that has been exploited quite a bit in the implementation of features in the Java language itself.
In this presentation we will understand what problem this feature really solves and how it has influenced other features in the Java language and on the Java platform.
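Invokedynamic itself cannot be written directly in Java source, but the java.lang.invoke API that backs it can. As a hedged taste of that machinery (an illustrative snippet of my own, not taken from the presentation):

    import java.lang.invoke.MethodHandle;
    import java.lang.invoke.MethodHandles;
    import java.lang.invoke.MethodType;

    public class IndyDemo {
        public static void main(String[] args) throws Throwable {
            MethodHandles.Lookup lookup = MethodHandles.lookup();

            // A handle to String.toUpperCase(), resolved once and then invocable
            // like a function value, which is the linkage model invokedynamic uses.
            MethodHandle toUpper = lookup.findVirtual(
                    String.class, "toUpperCase", MethodType.methodType(String.class));

            String result = (String) toUpper.invokeExact("invokedynamic");
            System.out.println(result); // INVOKEDYNAMIC
        }
    }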
Even with the recent explosion in alternative languages for the JVM, the vast majority of us are still writing code in “Java the language” in order to put bread on the table. Proper craftsmanship demands that we write the best Java code that we can possibly write. Fortunately we have a guide in Joshua Bloch's Effective Java.
Effective Java is organized into 78 standalone “items,” far too many to cover in one session. Instead I've chosen a subset of the most important techniques and practices that are commonly missed by today's Java programmers.
In Part II of this session, we'll cover those items we were unable to reach during Part I. We'll follow that up with a dive into the new features available in Java 7, describing new idioms for effective Java programming in the following areas:
“That's the way we've always done things” is a phrase commonly uttered over the course of a software development project. But oftentimes, organizations have instituted governance and policy based on yesterday's practices. We continue with these dated policies without ever examining their origin and whether they are necessary or provide any true value today. These policies serve as gates, which oftentimes impede progress. The story of Chesterton's Gate encourages us to ask “why” something is necessary before we decide whether it's beneficial to remove it. In this session, we examine several “gates” across several industries, including software development, and ask “why” to determine if each is still needed.
In this session, we examine examples (many of them quite humorous) of Chesterton's Gate, including several from our world of software development.